Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/09/11 12:45:53 UTC

Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #86

See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/86/display/redirect?page=changes>

Changes:

[noreply] Added type annotations to some combiners missing it. (#15414)

[noreply] [BEAM-12634] JmsIO auto scaling feature (#15464)

[noreply] [BEAM-12662] Get Flink version from cluster. (#15223)

[noreply] Port changes from Pub/Sub Lite to beam (#15418)

[heejong] [BEAM-12805] Fix XLang CombinePerKey test by explicitly assigning the

[BenWhitehead] [BEAM-8376] Google Cloud Firestore Connector - Add handling for

[noreply] Decreasing peak memory usage for beam.TupleCombineFn (#15494)

[noreply] [BEAM-12802] Add support for prefetch through data layers down through

[noreply] [BEAM-11097] Add implementation of side input cache (#15483)


------------------------------------------
[...truncated 41.27 KB...]
> Task :sdks:java:container:buildLinuxAmd64
> Task :sdks:java:container:goBuild
> Task :sdks:java:container:java11:copySdkHarnessLauncher
> Task :sdks:java:container:generateLicenseReport

> Task :sdks:java:container:pullLicenses
Copying already-fetched licenses from <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/reports/dependency-license> to <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses>
Already using interpreter /usr/bin/python3
Using base prefix '/usr'
New python executable in <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python3>
Also creating executable in <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python>
Installing setuptools, pip, wheel...
done.
DEPRECATION: Python 3.5 reached the end of its life on September 13th, 2020. Please upgrade your Python as Python 3.5 is no longer maintained. pip 21.0 will drop support for Python 3.5 in January 2021. pip 21.0 will remove support for this functionality.
Collecting beautifulsoup4<5.0,>=4.9.0
  Using cached beautifulsoup4-4.10.0-py3-none-any.whl (97 kB)
Collecting future<1.0.0,>=0.16.0
  Using cached future-0.18.2-py3-none-any.whl
Collecting pyyaml<6.0.0,>=3.12
  Using cached PyYAML-5.3.1-cp35-cp35m-linux_x86_64.whl
Collecting tenacity<6.0,>=5.0.2
  Using cached tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting soupsieve>1.2
  Using cached soupsieve-2.1-py3-none-any.whl (32 kB)
Collecting six>=1.9.0
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Installing collected packages: soupsieve, six, tenacity, pyyaml, future, beautifulsoup4
Successfully installed beautifulsoup4-4.10.0 future-0.18.2 pyyaml-5.3.1 six-1.16.0 soupsieve-2.1 tenacity-5.1.5
Executing <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python> <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py> --license_index=<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/reports/dependency-license/index.json>        --output_dir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses>        --dep_url_yaml=<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml> 
INFO:root:Pulling license for 207 dependencies using 16 threads.
INFO:root:pull_licenses_java.py succeed. It took 1.735595 seconds with 16 threads.
Copying licenses from <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses> to <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/third_party_licenses.>
Finished license_scripts.sh

> Task :sdks:java:container:java11:copyJavaThirdPartyLicenses

> Task :release:go-licenses:java:dockerRun
+ go-licenses save github.com/apache/beam/sdks/java/container --save_path=/output/licenses
+ go-licenses csv github.com/apache/beam/sdks/java/container
+ tee /output/licenses/list.csv
github.com/apache/beam/sdks/go/pkg/beam,https://github.com/apache/beam/blob/master/sdks/go/README.md,Apache-2.0
github.com/apache/beam/sdks/java/container,https://github.com/apache/beam/blob/master/LICENSE,Apache-2.0
google.golang.org/grpc,https://github.com/grpc/grpc-go/blob/master/LICENSE,Apache-2.0
golang.org/x/net,https://go.googlesource.com/net/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/golang/protobuf,https://github.com/golang/protobuf/blob/master/LICENSE,BSD-3-Clause
google.golang.org/protobuf,https://go.googlesource.com/protobuf/+/refs/heads/master/LICENSE,BSD-3-Clause
google.golang.org/genproto/googleapis/rpc/status,https://github.com/googleapis/go-genproto/blob/master/LICENSE,Apache-2.0
golang.org/x/sys,https://go.googlesource.com/sys/+/refs/heads/master/LICENSE,BSD-3-Clause
golang.org/x/text,https://go.googlesource.com/text/+/refs/heads/master/LICENSE,BSD-3-Clause
+ chmod -R a+w /output/licenses

> Task :release:go-licenses:java:createLicenses
> Task :sdks:java:container:java11:copyGolangLicenses
> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:container:java11:docker

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.

See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker
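The deprecation notice above already names the replacement workflow; as a consolidated sketch (the `gcr.io/project-id/my-image` path is the placeholder from the warning text, not a real image):

```shell
# One-time setup: register gcloud as a Docker credential helper,
# so plain `docker` commands can authenticate to GCR registries.
gcloud auth configure-docker

# Afterwards, use docker directly instead of the deprecated `gcloud docker`
# wrapper (placeholder image path, as in the warning above).
docker pull gcr.io/project-id/my-image
docker push gcr.io/project-id/my-image
```

This is the alternative the warning recommends for Docker clients newer than 18.03; the commands are not runnable outside an authenticated GCP environment.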

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
94155687fd15: Preparing
1a6154aca6a5: Preparing
4db553e1667a: Preparing
1db591f28d49: Preparing
e08ddbdc02fc: Preparing
010d17a9161d: Preparing
b34dc0cc89f9: Preparing
3b95ff4a5a2f: Preparing
cbaa10add834: Preparing
c09300ef95e5: Preparing
1c88a5e3119e: Preparing
1e87170fa4e6: Preparing
3891808a925b: Preparing
d402f4f1b906: Preparing
00ef5416d927: Preparing
8555e663f65b: Preparing
d00da3cd7763: Preparing
4e61e63529c2: Preparing
799760671c38: Preparing
c09300ef95e5: Waiting
1c88a5e3119e: Waiting
1e87170fa4e6: Waiting
010d17a9161d: Waiting
b34dc0cc89f9: Waiting
3891808a925b: Waiting
3b95ff4a5a2f: Waiting
8555e663f65b: Waiting
d00da3cd7763: Waiting
00ef5416d927: Waiting
cbaa10add834: Waiting
4e61e63529c2: Waiting
d402f4f1b906: Waiting
799760671c38: Waiting
1a6154aca6a5: Pushed
4db553e1667a: Pushed
e08ddbdc02fc: Pushed
010d17a9161d: Pushed
94155687fd15: Pushed
3b95ff4a5a2f: Pushed
1db591f28d49: Pushed
cbaa10add834: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
1c88a5e3119e: Pushed
00ef5416d927: Layer already exists
b34dc0cc89f9: Pushed
1e87170fa4e6: Pushed
8555e663f65b: Layer already exists
d00da3cd7763: Layer already exists
4e61e63529c2: Layer already exists
799760671c38: Layer already exists
c09300ef95e5: Pushed
20210911124335: digest: sha256:ec8a6b706ab30a1f4484e5b634dffcda5a41d1ce4d0225ad66f82959c24f0953 size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 11, 2021 12:45:23 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 11, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 11, 2021 12:45:24 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 11, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 11, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 11, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Sep 11, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 11, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash 53f1ff7280f0aaa797c7ce909826ae925c0fbce7a9dce4b552c7139927065582> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-U_H_coDwqqeXx86QmCauklwPvOep3OS1UscTmScGVYI.pb
Sep 11, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 11, 2021 12:45:29 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f254608, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2eeb0f9b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b1c538d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1645f294, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6325f352, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15c4af7a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6cbd0674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55d58825, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@19a64eae, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29a98d9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da3b078, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@544e8149, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d]
Sep 11, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 11, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 11, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 11, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 11, 2021 12:45:29 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60d40ff4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e5b7fba, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27755487, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f0cab0a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe7b6b0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab4ae59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77681ce4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d96bdf8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f76c2cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@306f6f1d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d7cac8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fc6deb7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0]
Sep 11, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 11, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 11, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 11, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 11, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 11, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 11, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 11, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 11, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 11, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 11, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 11, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 11, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-11_05_45_29-2969291427559089470?project=apache-beam-testing
Sep 11, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-11_05_45_29-2969291427559089470
Sep 11, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-11_05_45_29-2969291427559089470
Sep 11, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-11T12:45:36.679Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-ft0i. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-11T12:45:40.905Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 11, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-11T12:45:41.574Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/24323 instances, 2/0 CPUs, 30/168311 disk GB, 0/2397 SSD disk GB, 1/226 instance groups, 1/229 managed instance groups, 1/452 instance templates, 1/552 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
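When a workflow fails on quota as above (here `2/0` CPUs available in `us-central1`), the regional limits and current usage can be inspected with gcloud. A sketch, using the project and region from this log (the `--flatten`/`--format` projection is a common gcloud recipe and may need adjusting to your gcloud version):

```shell
# Show Compute Engine quota metric, limit, and current usage for the
# region the Dataflow job tried to run in.
gcloud compute regions describe us-central1 \
    --project=apache-beam-testing \
    --flatten="quotas[]" \
    --format="table(quotas.metric,quotas.limit,quotas.usage)"
```

Comparing the `CPUS` row against the job's requested worker footprint shows whether the failure is a genuine limit or transient contention from concurrent test jobs.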
Sep 11, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-11T12:45:41.663Z: Cleaning up.
Sep 11, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-11T12:45:41.712Z: Worker pool stopped.
Sep 11, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-11T12:45:42.985Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 11, 2021 12:45:47 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-11_05_45_29-2969291427559089470 failed with status FAILED.
Sep 11, 2021 12:45:47 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace co_gbk
Load test results for test (ID): a220d965-d152-4485-9e32-8c692a5803a6 and timestamp: 2021-09-11T12:45:24.406000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                       0.0
dataflow_v2_java11_total_bytes_count                      -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid job state: FAILED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210911124335
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ec8a6b706ab30a1f4484e5b634dffcda5a41d1ce4d0225ad66f82959c24f0953
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210911124335]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ec8a6b706ab30a1f4484e5b634dffcda5a41d1ce4d0225ad66f82959c24f0953]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210911124335] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ec8a6b706ab30a1f4484e5b634dffcda5a41d1ce4d0225ad66f82959c24f0953])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ec8a6b706ab30a1f4484e5b634dffcda5a41d1ce4d0225ad66f82959c24f0953
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ec8a6b706ab30a1f4484e5b634dffcda5a41d1ce4d0225ad66f82959c24f0953
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ec8a6b706ab30a1f4484e5b634dffcda5a41d1ce4d0225ad66f82959c24f0953].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 31s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/bvfr5nun6tgdk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #242

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/242/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-12712] Spark: Exclude looping timer tests.

[Kyle Weaver] [BEAM-13919] Annotate PerKeyOrderingTest with UsesStatefulParDo.

[noreply] Update 2.36.0 blog post to mention ARM64 support

[noreply] Minor: Disable checker framework in nightly snapshot (#16829)

[artur.khanin] Updated example link

[noreply] [BEAM-13860] Make `DoFn.infer_output_type` return element type (#16788)

[noreply] [BEAM-13894] Unit test utilities in the ptest package (#16830)

[Kenneth Knowles] Add test for processing time continuation trigger

[noreply] [BEAM-13922] [Coverage] Make boot.go more testable and add tests

[noreply] Exclude SpannerChangeStream IT from Dataflow V1 postcommit (#16851)

[noreply] [BEAM-13930] Address StateSpec consistency issue between Runner and Fn


------------------------------------------
[...truncated 1.06 MB...]
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    ./dist_proc/dax/workflow/****/streaming/windowing_api_delegate.h:67
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    ./dist_proc/dax/workflow/****/streaming/windowing_api_delegate.h:67
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    ./dist_proc/dax/workflow/****/streaming/windowing_api_delegate.h:67
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    ./dist_proc/dax/workflow/****/streaming/windowing_api_delegate.h:67
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    ./dist_proc/dax/workflow/****/streaming/windowing_api_delegate.h:67
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    ./dist_proc/dax/workflow/****/streaming/windowing_api_delegate.h:67
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
[...truncated repeated identical log entries...]
FATAL: command execution failed
java.io.IOException: Backing channel 'apache-beam-jenkins-8' is disconnected.
	at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:216)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:286)
	at com.sun.proxy.$Proxy139.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1211)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1203)
	at hudson.Launcher$ProcStarter.join(Launcher.java:523)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:803)
	at hudson.model.Build$BuildExecution.build(Build.java:197)
	at hudson.model.Build$BuildExecution.doRun(Build.java:163)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:513)
	at hudson.model.Run.execute(Run.java:1906)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:429)
Caused by: java.io.IOException: Pipe closed after 0 cycles
	at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:118)
	at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:101)
	at hudson.remoting.FlightRecorderInputStream.read(FlightRecorderInputStream.java:93)
	at hudson.remoting.ChunkedInputStream.readHeader(ChunkedInputStream.java:74)
	at hudson.remoting.ChunkedInputStream.readUntilBreak(ChunkedInputStream.java:104)
	at hudson.remoting.ChunkedCommandTransport.readBlock(ChunkedCommandTransport.java:39)
	at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:34)
	at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:61)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-8 is offline; cannot locate jdk_1.8_latest

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #248

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/248/display/redirect>

Changes:


------------------------------------------
[...truncated 552.90 KB...]
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
[...truncated repeated identical log entries...]
Build timed out (after 240 minutes). Marking the build as aborted.
Agent went offline during the build
ERROR: Connection was broken: org.apache.sshd.common.channel.exception.SshChannelClosedException: write(ChannelOutputStream[ChannelExec[id=4, recipient=0]-ClientSessionImpl[jenkins@/35.222.180.153:22]] SSH_MSG_CHANNEL_DATA) len=2 - channel already closed
	at org.apache.sshd.common.channel.ChannelOutputStream.write(ChannelOutputStream.java:132)
	at java.io.OutputStream.write(OutputStream.java:75)
	at hudson.remoting.ChunkedOutputStream.sendFrame(ChunkedOutputStream.java:92)
	at hudson.remoting.ChunkedOutputStream.sendBreak(ChunkedOutputStream.java:65)
	at hudson.remoting.ChunkedCommandTransport.writeBlock(ChunkedCommandTransport.java:46)
	at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.write(AbstractSynchronousByteArrayCommandTransport.java:46)
	at hudson.remoting.Channel.send(Channel.java:766)
	at hudson.remoting.Channel.close(Channel.java:1487)
	at hudson.remoting.Channel.close(Channel.java:1454)
	at hudson.slaves.SlaveComputer.closeChannel(SlaveComputer.java:894)
	at hudson.slaves.SlaveComputer.access$100(SlaveComputer.java:108)
	at hudson.slaves.SlaveComputer$2.run(SlaveComputer.java:774)
	at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
	at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:68)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-9 is offline; cannot locate jdk_1.8_latest


Jenkins build is back to normal : beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #326

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/326/display/redirect?page=changes>



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #325

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/325/display/redirect?page=changes>

Changes:

[andyye333] Add extra details to PubSub matcher errors

[noreply] Merge pull request #17559 from [BEAM-14423] Add exception injection

[noreply] [BEAM-11104] Allow self-checkpointing SDFs to return without finishing

[noreply] Merge pull request #17544 from [BEAM-14415] Exception handling tests for

[noreply] Merge pull request #17565 from [BEAM-14413] add Kafka exception test

[noreply] Merge pull request #17555 from [BEAM-14417] Adding exception handling

[noreply] [BEAM-14433] Improve Go split error message. (#17575)

[noreply] [BEAM-14429] Force java load test on dataflow runner v2

[noreply] Merge pull request #17577 from [BEAM-14435] Adding exception handling

[noreply] [BEAM-14347] Add generic registration functions for iters and emitters

[noreply] [BEAM-14169] Add Credentials rotation cron job for clusters (#17383)

[noreply] [BEAM-14347] Add generic registration for accumulators (#17579)


------------------------------------------
[...truncated 50.08 KB...]
cac2fff6ae3d: Preparing
08fa02ce37eb: Preparing
a037458de4e0: Preparing
797e7535d562: Waiting
bafdbe68e4ae: Preparing
a13c519c6361: Preparing
985538ac5212: Waiting
2d92778868f4: Waiting
77a8640680ae: Waiting
2cc68c5e343e: Waiting
6b0ff0872aca: Waiting
9cec090aca3b: Waiting
caafae3933f1: Waiting
08fa02ce37eb: Waiting
9212fcd3b523: Waiting
30e908a38a18: Waiting
cac2fff6ae3d: Waiting
a13c519c6361: Waiting
91a07825402e: Waiting
3d4bf3a3f7e6: Pushed
fe4ffff945d1: Pushed
762e10fd3428: Pushed
c72c53396536: Pushed
0fd104b53d9d: Pushed
797e7535d562: Pushed
91a07825402e: Pushed
2d92778868f4: Pushed
985538ac5212: Pushed
caafae3933f1: Pushed
77a8640680ae: Pushed
0a41459588e0: Layer already exists
2cc68c5e343e: Pushed
30e908a38a18: Layer already exists
cac2fff6ae3d: Layer already exists
08fa02ce37eb: Layer already exists
a037458de4e0: Layer already exists
bafdbe68e4ae: Layer already exists
a13c519c6361: Layer already exists
9cec090aca3b: Pushed
9212fcd3b523: Pushed
6b0ff0872aca: Pushed
20220510124335: digest: sha256:d1c24dcee958b41fcb356bbb43b48b56a66c3594825ab5d391bd6615ea4bab7d size: 4935

> Task :sdks:java:testing:load-tests:run
May 10, 2022 12:45:36 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
May 10, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 222 files. Enable logging at DEBUG level to see which files will be staged.
May 10, 2022 12:45:37 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
May 10, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
May 10, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 222 files from PipelineOptions.filesToStage to staging location to prepare for execution.
May 10, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 222 files cached, 0 files newly uploaded in 0 seconds
May 10, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
May 10, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <120279 bytes, hash e3c08e414f187d38f2160df50af18d0256522d6f39e3ad93b6f4bfb9c110a52e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-48COQU8YfTjyFg31CvGNAlZSLW85462TtvS_ucEQpS4.pb
May 10, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
May 10, 2022 12:45:43 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 1 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4232b34a]
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
May 10, 2022 12:45:43 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 1 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ffd4cba]
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.40.0-SNAPSHOT
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-05-10_05_45_43-11087100749768332113?project=apache-beam-testing
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-05-10_05_45_43-11087100749768332113
May 10, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-05-10_05_45_43-11087100749768332113
May 10, 2022 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-05-10T12:45:51.199Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-05-9f1o. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:55.650Z: Worker configuration: e2-standard-2 in us-central1-b.
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.172Z: Expanding SplittableParDo operations into optimizable parts.
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.193Z: Expanding CollectionToSingleton operations into optimizable parts.
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.283Z: Expanding CoGroupByKey operations into optimizable parts.
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.353Z: Expanding SplittableProcessKeyed operations into optimizable parts.
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.408Z: Expanding GroupByKey operations into streaming Read/Write steps
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.457Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.567Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.628Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.660Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.693Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.720Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.742Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.775Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.802Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.831Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.862Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.896Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.930Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:56.992Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:57.021Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:57.055Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:57.088Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:57.124Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:57.156Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:57.189Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:57.222Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:57.257Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
May 10, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:57.282Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
May 10, 2022 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:57.317Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
May 10, 2022 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:57.517Z: Running job using Streaming Engine
May 10, 2022 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:45:57.758Z: Starting 5 workers in us-central1-b...
May 10, 2022 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:46:20.385Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
May 10, 2022 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:46:20.521Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
May 10, 2022 12:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T12:47:29.303Z: Workers have started successfully.
May 10, 2022 4:00:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T16:00:53.672Z: Cancel request is committed for workflow job: 2022-05-10_05_45_43-11087100749768332113.
May 10, 2022 4:00:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T16:00:53.758Z: Cleaning up.
May 10, 2022 4:00:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T16:00:53.845Z: Stopping worker pool...
May 10, 2022 4:00:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T16:00:53.901Z: Stopping worker pool...
May 10, 2022 4:01:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T16:01:31.024Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
May 10, 2022 4:01:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-10T16:01:31.095Z: Worker pool stopped.
May 10, 2022 4:01:37 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-05-10_05_45_43-11087100749768332113 finished with status CANCELLED.
Load test results for test (ID): 499e6d58-f7b6-4f54-9ac3-cf03a99f75f3 and timestamp: 2022-05-10T12:45:37.587000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11578.784
dataflow_v2_java11_total_bytes_count               9.2037332E9
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
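[Editor's note, not part of the original log: the RuntimeException above is raised by the load-test harness itself, which treats any terminal job state other than a successful completion as a failure. Since this streaming load test is cancelled by the harness after its time budget, the job ends CANCELLED and the check fires. A minimal sketch of that kind of terminal-state check is shown below; the enum and method names are illustrative stand-ins, not Beam's actual `PipelineResult.State` API.]

```java
// Illustrative sketch of a terminal-state check like the one in
// org.apache.beam.sdk.loadtests.JobFailure.handleFailure. Names here
// are hypothetical; Beam's real API differs.
public class JobStateCheck {
    enum State { DONE, FAILED, CANCELLED, UPDATED }

    // Accepts only a successful terminal state; any other state (including
    // CANCELLED, as seen in this build) surfaces as a RuntimeException.
    static void handleTerminalState(State state) {
        if (state != State.DONE) {
            throw new RuntimeException("Invalid job state: " + state + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleTerminalState(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

[Under this sketch, a cancelled job prints "Invalid job state: CANCELLED.", matching the exception message in the trace above.]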

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220510124335
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d1c24dcee958b41fcb356bbb43b48b56a66c3594825ab5d391bd6615ea4bab7d
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220510124335]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d1c24dcee958b41fcb356bbb43b48b56a66c3594825ab5d391bd6615ea4bab7d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220510124335] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d1c24dcee958b41fcb356bbb43b48b56a66c3594825ab5d391bd6615ea4bab7d])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d1c24dcee958b41fcb356bbb43b48b56a66c3594825ab5d391bd6615ea4bab7d
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d1c24dcee958b41fcb356bbb43b48b56a66c3594825ab5d391bd6615ea4bab7d
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d1c24dcee958b41fcb356bbb43b48b56a66c3594825ab5d391bd6615ea4bab7d].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:398247a3ea0a8a4d32219c6d040ab83c02ee8bb256e24d4f14ff6325a752e513
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:398247a3ea0a8a4d32219c6d040ab83c02ee8bb256e24d4f14ff6325a752e513
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:398247a3ea0a8a4d32219c6d040ab83c02ee8bb256e24d4f14ff6325a752e513].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 26s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/tm3elojxvleli

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #324

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/324/display/redirect?page=changes>

Changes:

[elias.segundo] Changing elegibility to AllNodeElegibility

[chamikaramj] Adds code reviewers for GCP I/O connectors and KafkaIO to Beam OWNERS


------------------------------------------
[...truncated 50.36 KB...]
0a41459588e0: Preparing
30e908a38a18: Preparing
cac2fff6ae3d: Preparing
08fa02ce37eb: Preparing
a037458de4e0: Preparing
bafdbe68e4ae: Preparing
dab9df729bdb: Waiting
a13c519c6361: Preparing
567d17764da3: Waiting
ad1330e8e2c2: Waiting
0ee803f1ace5: Waiting
0a41459588e0: Waiting
30e908a38a18: Waiting
8ace383ec62e: Waiting
37d549513643: Waiting
cac2fff6ae3d: Waiting
8181f27b5b0f: Waiting
6966898a4e6a: Waiting
d35bbc73c238: Waiting
5cf508509b0a: Waiting
a13c519c6361: Waiting
bafdbe68e4ae: Waiting
a037458de4e0: Waiting
3822c88d8ca1: Pushed
c9f02c6a5a95: Pushed
02b39b4f8b0b: Pushed
0df8f3ec7c98: Pushed
4ce42ed94186: Pushed
8ace383ec62e: Pushed
ad1330e8e2c2: Pushed
d35bbc73c238: Pushed
8181f27b5b0f: Pushed
dab9df729bdb: Pushed
0ee803f1ace5: Pushed
37d549513643: Pushed
0a41459588e0: Layer already exists
cac2fff6ae3d: Layer already exists
30e908a38a18: Layer already exists
08fa02ce37eb: Layer already exists
a037458de4e0: Layer already exists
bafdbe68e4ae: Layer already exists
a13c519c6361: Layer already exists
567d17764da3: Pushed
5cf508509b0a: Pushed
6966898a4e6a: Pushed
20220509124335: digest: sha256:b2d9c303acfcc8955d7b444741c7569d57137c4c82395a560936cb00f2c8f3c5 size: 4935

> Task :sdks:java:testing:load-tests:run
May 09, 2022 12:45:41 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
May 09, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 222 files. Enable logging at DEBUG level to see which files will be staged.
May 09, 2022 12:45:42 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
May 09, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
May 09, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 222 files from PipelineOptions.filesToStage to staging location to prepare for execution.
May 09, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 222 files cached, 0 files newly uploaded in 0 seconds
May 09, 2022 12:45:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
May 09, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <120277 bytes, hash 67bd78397b615dde98d6de143f9e230ad70ee5b644cb427036a82962570ddfa2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Z714OXthXd6Y1t4UP54jCtcO5bZEy0JwNqgpYlcN36I.pb
May 09, 2022 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
May 09, 2022 12:45:52 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@751ae8a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@235d659c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4232b34a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da16263, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f5ce0bb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47e51549, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@101a461c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@360e9c06, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ebffb44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@311ff287, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7377781e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31db34da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@109f8c7e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5edacf20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@16a5eb6d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e307087, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1220ef43, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a8b81e8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@234cff57, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e8507f1]
May 09, 2022 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
May 09, 2022 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
May 09, 2022 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
May 09, 2022 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
May 09, 2022 12:45:52 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3bf54172, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c9a6717, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b3cde6f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d091cad, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c663eaf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3bb5ceb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e692555, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ba0ae41, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76fe6cdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2ffb3aec, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@786ff1cb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46039a21, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@431e86b1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35c4e864, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32a2a6be, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@682af059, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f36c8e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4da39ca9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a9344f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5584d9c6]
May 09, 2022 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
May 09, 2022 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
May 09, 2022 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
May 09, 2022 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
May 09, 2022 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
May 09, 2022 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
May 09, 2022 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
May 09, 2022 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
May 09, 2022 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
May 09, 2022 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
May 09, 2022 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
May 09, 2022 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.40.0-SNAPSHOT
May 09, 2022 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-05-09_05_45_52-14263185450666302757?project=apache-beam-testing
May 09, 2022 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-05-09_05_45_52-14263185450666302757
May 09, 2022 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-05-09_05_45_52-14263185450666302757
May 09, 2022 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-05-09T12:45:57.782Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-05-770f. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
May 09, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:03.101Z: Worker configuration: e2-standard-2 in us-central1-b.
May 09, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:03.732Z: Expanding SplittableParDo operations into optimizable parts.
May 09, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:03.763Z: Expanding CollectionToSingleton operations into optimizable parts.
May 09, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:03.883Z: Expanding CoGroupByKey operations into optimizable parts.
May 09, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:03.943Z: Expanding SplittableProcessKeyed operations into optimizable parts.
May 09, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:03.980Z: Expanding GroupByKey operations into streaming Read/Write steps
May 09, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.081Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
May 09, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.220Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.249Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.272Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.293Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.322Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.359Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.394Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.425Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.456Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.488Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.539Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.576Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.602Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.634Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.659Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.731Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.780Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.834Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.896Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.948Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:04.995Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:05.082Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:05.255Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:05.414Z: Running job using Streaming Engine
May 09, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:05.672Z: Starting 5 workers in us-central1-b...
May 09, 2022 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:12.974Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
May 09, 2022 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:46:28.845Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
May 09, 2022 12:47:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T12:47:37.954Z: Workers have started successfully.
May 09, 2022 4:01:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T16:01:00.369Z: Cancel request is committed for workflow job: 2022-05-09_05_45_52-14263185450666302757.
May 09, 2022 4:01:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T16:01:00.454Z: Cleaning up.
May 09, 2022 4:01:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T16:01:00.558Z: Stopping worker pool...
May 09, 2022 4:01:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T16:01:00.632Z: Stopping worker pool...
May 09, 2022 4:01:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T16:01:41.149Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
May 09, 2022 4:01:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-09T16:01:41.189Z: Worker pool stopped.
May 09, 2022 4:01:47 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-05-09_05_45_52-14263185450666302757 finished with status CANCELLED.
Load test results for test (ID): 2105707b-01ae-43cf-8e21-0df3be531961 and timestamp: 2022-05-09T12:45:42.510000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11571.163
dataflow_v2_java11_total_bytes_count             3.22735089E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220509124335
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b2d9c303acfcc8955d7b444741c7569d57137c4c82395a560936cb00f2c8f3c5
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220509124335]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b2d9c303acfcc8955d7b444741c7569d57137c4c82395a560936cb00f2c8f3c5]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220509124335] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b2d9c303acfcc8955d7b444741c7569d57137c4c82395a560936cb00f2c8f3c5])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b2d9c303acfcc8955d7b444741c7569d57137c4c82395a560936cb00f2c8f3c5
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b2d9c303acfcc8955d7b444741c7569d57137c4c82395a560936cb00f2c8f3c5
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b2d9c303acfcc8955d7b444741c7569d57137c4c82395a560936cb00f2c8f3c5].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 32s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/5ovvwz36y4a7e

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #323

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/323/display/redirect>

Changes:


------------------------------------------
[...truncated 96.13 KB...]
Traceback (most recent call last):
  File "/usr/lib/python3.8/urllib/request.py", line 1354, in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
  File "/usr/lib/python3.8/http/client.py", line 1256, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1302, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1251, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1011, in _send_output
    self.send(msg)
  File "/usr/lib/python3.8/http/client.py", line 951, in send
    self.connect()
  File "/usr/lib/python3.8/http/client.py", line 922, in connect
    self.sock = self._create_connection(
  File "/usr/lib/python3.8/socket.py", line 808, in create_connection
    raise err
  File "/usr/lib/python3.8/socket.py", line 796, in create_connection
    sock.connect(sa)
OSError: [Errno 101] Network is unreachable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py",> line 58, in pull_from_url
    url_read = urlopen(Request(url, headers={'User-Agent': 'Apache Beam'}))
  File "/usr/lib/python3.8/urllib/request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.8/urllib/request.py", line 525, in open
    response = self._open(req, data)
  File "/usr/lib/python3.8/urllib/request.py", line 542, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 1383, in http_open
    return self.do_open(http.client.HTTPConnection, req)
  File "/usr/lib/python3.8/urllib/request.py", line 1357, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 101] Network is unreachable>
ERROR:root:Invalid url for checkstyle-8.23: http://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt. Retrying...
[...identical "Network is unreachable" traceback repeated for each retry, truncated...]
ERROR:root:Invalid url for checkstyle-8.23: http://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt after 9 retries.
ERROR:root:['jFormatString-3.0.0', 'spotbugs-annotations-4.0.6', 'checkstyle-8.23']
ERROR:root:**************************************** Licenses were not able to be pulled automatically for some dependencies. Please search source code of the dependencies on the internet and add "license" and "notice" (if available) field to <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml> for each missing license. Dependency List: [checkstyle-8.23,jFormatString-3.0.0,spotbugs-annotations-4.0.6]
INFO:root:pull_licenses_java.py failed. It took 186.739673 seconds with 16 threads.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py",> line 321, in <module>
    raise RuntimeError('{n} error(s) occurred.'.format(n=len(error_msg)),
RuntimeError: ('1 error(s) occurred.', ['**************************************** Licenses were not able to be pulled automatically for some dependencies. Please search source code of the dependencies on the internet and add "license" and "notice" (if available) field to <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml> for each missing license. Dependency List: [checkstyle-8.23,jFormatString-3.0.0,spotbugs-annotations-4.0.6]'])

> Task :sdks:java:container:pullLicenses FAILED
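
The pullLicenses failure above stems from repeatedly fetching http://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt over plain HTTP and hitting "Network is unreachable". As a hedged illustration only (this is not Beam's actual fix, and to_https is a hypothetical helper, not part of pull_licenses_java.py), one mitigation would be to normalize such license URLs to HTTPS before retrying:

```python
from urllib.parse import urlsplit, urlunsplit

def to_https(url: str) -> str:
    """Rewrite a plain-http license URL to https; leave other schemes alone."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

# The URL from the log above:
print(to_https("http://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt"))
# -> https://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt
```

Whether the HTTPS endpoint was reachable from this Jenkins worker is not shown in the log; the permanent remedy the error message itself suggests is adding "license" and "notice" entries to dep_urls_java.yaml for the three listed dependencies.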
> Task :sdks:java:container:goPrepare UP-TO-DATE

> Task :sdks:java:container:goBuild
/home/jenkins/go/bin/go1.18.1 build -o ./build/target/linux_amd64/boot boot.go

> Task :sdks:java:container:java11:copySdkHarnessLauncher
Execution optimizations have been disabled for task ':sdks:java:container:java11:copySdkHarnessLauncher' to ensure correctness due to the following reasons:
  - Gradle detected a problem with the following location: '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target>'. Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this output of task ':sdks:java:container:downloadCloudProfilerAgent' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed. Please refer to https://docs.gradle.org/7.4/userguide/validation_problems.html#implicit_dependency for more details about this problem.
  - Gradle detected a problem with the following location: '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target>'. Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this output of task ':sdks:java:container:pullLicenses' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed. Please refer to https://docs.gradle.org/7.4/userguide/validation_problems.html#implicit_dependency for more details about this problem.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:container:pullLicenses'.
> Process 'command './license_scripts/license_script.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3m 43s
103 actionable tasks: 66 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/lb2b4fsrmedkm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #322

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/322/display/redirect?page=changes>

Changes:

[kevinsijo] Setting up a basic directory

[kevinsijo] Mirroring Python SDK's directory structure

[kerrydc] Adds initial tests

[kevinsijo] 'runners' is the correct directory name

[Pablo Estrada] sketching the core API for JS SDK

[jonathanlui] add .gitignore for node/ts project

[Robert Bradshaw] Worker directory.

[Robert Bradshaw] Fix compile errors with explicit any for callables.

[Robert Bradshaw] Add worker entry point.

[Robert Bradshaw] Add proto generation code.

[Robert Bradshaw] Add generated proto files.

[Robert Bradshaw] Attempts to get ts protos to compile.

[Robert Bradshaw] Exclude ts protos for now.

[Robert Bradshaw] More changes to get ts protos working.

[Robert Bradshaw] Update scripts and config to get protos compiling.

[Robert Bradshaw] Update generated files.

[jonathanlui] add build and clean script to compile ts

[Robert Bradshaw] Generate server for loopback worker.

[Robert Bradshaw] Generated grpc servers for loopback.

[Robert Bradshaw] Add typescript formatter.

[Robert Bradshaw] Loopback server (that does nothing).

[Robert Bradshaw] Working server.

[Pablo Estrada] Starting expansion of primitive transforms

[Pablo Estrada] Starting to implement and support standard coders

[Robert Bradshaw] Also generate grpc clients.

[Robert Bradshaw] Basic implementation of worker harness.

[Pablo Estrada] fix the build

[Robert Bradshaw] Add some missing files for worker harness.

[Robert Bradshaw] Refactor operators to use registration.

[jonathanlui] enable ts in mocha

[jonathanlui] update readme

[jonathanlui] --save-dev @types/mocha

[jonathanlui] translate core_test.js to typescript

[Robert Bradshaw] Encapsulate worker service in a class.

[Kenneth Knowles] Port standard_coders_test to typescript (superficially)

[Pablo Estrada] Starting the proto translation of Impulse, ParDo, GBK

[Robert Bradshaw] Add some tests for the worker code.

[Robert Bradshaw] Fixing old lock file error.

[Pablo Estrada] Adding transform names and fixing GBK coder issue

[Robert Bradshaw] npx tsfmt -r src/apache_beam/base.ts src/apache_beam/transforms/core.ts

[Kenneth Knowles] switch to import style require() statements

[Kenneth Knowles] Add Coder interface using protobufjs classes

[Kenneth Knowles] BytesCoder with some failures

[noreply] Added GeneralObjectCoder and using it as coder for most transforms (#9)

[Kenneth Knowles] Fix order of arguments to deepEqual

[Kenneth Knowles] Encode expected encoding as binary

[Robert Bradshaw] Refactor API to allow for composites.

[jrmccluskey] Initial setup for automated Java expansion startup

[jrmccluskey] Update exp_service.ts

[Kenneth Knowles] Fix up coder deserialization

[Robert Bradshaw] Simplify GBK coder computation.

[Robert Bradshaw] Remove top-level PValue.

[Pablo Estrada] Make tests green

[Robert Bradshaw] Rename PValueish to PValue.

[jonathanlui] node runner

[jonathanlui] whitespaces

[Robert Bradshaw] Make Runner.run async.

[jonathanlui] bson and fast-deep-equal should not be listed as devdependency

[jrmccluskey] Add basic Dockerfile that starts ExternalWorkerPool

[Robert Bradshaw] Direct runner.

[kevinsijo] Testing expansion service communication

[Robert Bradshaw] Added flatten, assertion checkers.

[Pablo Estrada] progress on basic coders

[Robert Bradshaw] Fixing the build.

[Robert Bradshaw] Cleanup, simplify access.

[Pablo Estrada] Adding limited support for KVCoder and IterableCoder

[Robert Bradshaw] Introduce PipelineContext.

[Robert Bradshaw] Add toProto to all coders.

[Robert Bradshaw] Some work with coders.

[Robert Bradshaw] Remove debug logging.

[Robert Bradshaw] Use coders over data channel.

[Kenneth Knowles] explicitly sequence sub-coder serializations

[Kenneth Knowles] no more need to extend FakeCoder

[Kenneth Knowles] actually advance reader

[Kenneth Knowles] autoformat

[Kenneth Knowles] protobufjs already can write and read signed varints

[Kenneth Knowles] with improved test harness, kv has many more failures

[Kenneth Knowles] read bytescoder from correct position

[Kenneth Knowles] no more fake coders

[Kenneth Knowles] varint examples all work

[Kenneth Knowles] simplify coder value parsing

[Kenneth Knowles] global window coder

[Kenneth Knowles] fix swapEndian32

[Robert Bradshaw] Add P(...) operator.

[kevinsijo] Implementing RowCoder encoding.

[jrmccluskey] remove unused container dir

[kevinsijo] Corrected sorting of encoded positions to reflect an argsort instead.

[Robert Bradshaw] Populate environments.

[kevinsijo] Implementing RowCoder decoding.

[Kenneth Knowles] preliminary unbounded iterable coder

[Kenneth Knowles] friendlier description of standard coder test case

[Kenneth Knowles] fix test harness; iterable works

[jrmccluskey] first pass at boot.go

[jonathanlui] update package-lock.json

[jonathanlui] make NodeRunner a subclass of Runner

[jonathanlui] add waitUntilFinish interface member

[Pablo Estrada] Adding double coder

[Kenneth Knowles] scaffolding for windowed values

[Pablo Estrada] Adding type information to PCollection and PTransform

[jonathanlui] fix direct runner

[Pablo Estrada] Adding typing information for DoFns

[Kenneth Knowles] add interval window

[Robert Bradshaw] Export PValue.

[Robert Bradshaw] Add CombineFn interface.

[Robert Bradshaw] Typed flatten.

[jonathanlui] add runAsync method to base.Runner

[Kenneth Knowles] add Long package

[Pablo Estrada] Adding more types. Making PValue typed

[Kenneth Knowles] instant coder draft

[Robert Bradshaw] Return job state from direct runner.

[Kenneth Knowles] type instant = long

[jonathanlui] implement NodeRunner.runPipeline

[Kenneth Knowles] autoformat

[kevinsijo] Completed implementation of basic row coder

[Kenneth Knowles] Fix IntervalWindowCoder, almost

[Kenneth Knowles] fix interval window coder

[Kenneth Knowles] autoformat

[Robert Bradshaw] loopback runner works

[Kenneth Knowles] move core element types into values.ts

[Kenneth Knowles] just build object directly to be cool

[Robert Bradshaw] GBK working on ULR.

[Robert Bradshaw] Async transforms.

[Robert Bradshaw] External transform graph splicing.

[Kenneth Knowles] progress on windowed value: paneinfo encoding

[Robert Bradshaw] Fix merge.

[Robert Bradshaw] autoformat

[Kenneth Knowles] full windowed value coder

[kerrydc] Updates tests to use correct types, adds generics where needed to DoFns

[Robert Bradshaw] Add serialization libraries.

[Robert Bradshaw] Add Split() PTransform, for producing multiple outputs from a single

[Robert Bradshaw] Schema-encoded external payloads.

[kevinsijo] Adding Schema inference from JSON

[Pablo Estrada] Removing unused directories

[Pablo Estrada] Support for finishBundle and improving typing annotations.

[Pablo Estrada] A base implementation of combiners with GBK/ParDo

[Robert Bradshaw] Fully propagate windowing information in both remote and direct runner.

[Robert Bradshaw] Make args and kwargs optional for python external transform.

[Robert Bradshaw] Infer schema for external transforms.

[Pablo Estrada] Implementing a custom combine fn as an example. Small fixes

[Robert Bradshaw] Fix missing windowing information in combiners.

[Robert Bradshaw] PostShuffle needn't group by key as that's already done.

[Robert Bradshaw] Guard pre-combine for global window only.

[Robert Bradshaw] WindowInto

[Robert Bradshaw] Fix optional kwargs.

[Robert Bradshaw] A couple of tweaks for js + py

[Robert Bradshaw] Add windowing file.

[Robert Bradshaw] CombineBy transform, stand-alone WordCount.

[Robert Bradshaw] cleanup

[Robert Bradshaw] Actually fix optional external kwargs.

[Robert Bradshaw] Demo2, textio read.

[Robert Bradshaw] Add command lines for starting up the servers.

[Robert Bradshaw] Run prettier on the full codebase.

[Robert Bradshaw] Update deps.

[Pablo Estrada] Adding docstrings for core.ts. Prettier dependency

[Pablo Estrada] Documenting coder interfaces

[Pablo Estrada] Added documentation for a few standard coders

[Robert Bradshaw] Unified grouping and combining.

[Robert Bradshaw] Allow PCollection ids to be lazy.

[Robert Bradshaw] Reorganize module structure.

[Robert Bradshaw] A couple more renames.

[Robert Bradshaw] Simplify.

[Robert Bradshaw] Consolidation.

[Robert Bradshaw] Fix build.

[Robert Bradshaw] Add optional context to ParDo.

[Robert Bradshaw] fixup: iterable coder endian sign issue

[Robert Bradshaw] omit context for map(console.log)

[Robert Bradshaw] Fix ReadFromText coders.

[Robert Bradshaw] Flesh out README with overview and current state.

[noreply] Readme typo

[Robert Bradshaw] Two more TODOs.

[noreply] Add a pointer to the example wordcount to the readme.

[Pablo Estrada] Documenting coders and implementing unknown-length method

[Robert Bradshaw] UIID dependency.

[Robert Bradshaw] Artifact handling.

[Robert Bradshaw] Properly wait on data channel for bundle completion.

[Robert Bradshaw] Automatic java expansion service startup.

[Robert Bradshaw] Process promises.

[Robert Bradshaw] Implement side inputs.

[Robert Bradshaw] Cleanup.

[Robert Bradshaw] Put complex context stuff in its own file.

[Robert Bradshaw] Rename BoundedWindow to just Window.

[Robert Bradshaw] Alternative splitter class.

[Pablo Estrada] Documenting internal functions

[Robert Bradshaw] Take a pass clarifying the TODOs.

[Robert Bradshaw] Sql transform wrapper.

[Robert Bradshaw] Incorporate some feedback into the TODOs.

[Robert Bradshaw] More TODOs.

[Robert Bradshaw] Remove app placeholder.

[Robert Bradshaw] Apache license headers.

[Robert Bradshaw] More TODOs

[jankuehle] Suggestions for TypeScript todos

[dannymccormick] Add actions for typescript sdk

[dannymccormick] Fix test command

[noreply] Add missing version

[dannymccormick] Fix codecovTest command

[noreply] Only do prettier check on linux

[noreply] Only get codecov on linux

[Robert Bradshaw] Resolve some comments.

[Robert Bradshaw] Fix compile errors.

[Robert Bradshaw] Prettier.

[Robert Bradshaw] Re-order expandInternal arguments pending unification.

[Robert Bradshaw] More consistent and stricter PTransform naming.

[Robert Bradshaw] Notes on explicit, if less idiomatic, use of classes.

[Robert Bradshaw] Let DoFn be an interface rather than a class.

[Robert Bradshaw] Provide DoFn context to start and finish bundle.

[Robert Bradshaw] Optional promise code simplification.

[Robert Bradshaw] Cleanup todos.

[Robert Bradshaw] Avoid any type where not needed.

[Robert Bradshaw] Apache RAT excludes for typescript.

[Robert Bradshaw] Remove empty READMEs.

[Robert Bradshaw] Add licences statement to readme files.

[Robert Bradshaw] More RAT fixes.

[Robert Bradshaw] Another unsupported coder.

[Robert Bradshaw] Remove debugging code.

[noreply] Fix automatic naming with code coverage.

[Robert Bradshaw] Coders cleanup.

[Robert Bradshaw] Add tests for RowCoder.

[Robert Bradshaw] Normalize capitalization, comments.

[Robert Bradshaw] Install typescript closure packages.

[Robert Bradshaw] npm audit fix

[Robert Bradshaw] Move more imports out of base.

[Robert Bradshaw] Changes needed to compile with ts closure plugin.

[Robert Bradshaw] Use ttsc and ts-closure-transform plugin.

[Robert Bradshaw] Serialization registration to actually get serialization working.

[Robert Bradshaw] Container images working on local runner.

[Robert Bradshaw] Add a portable job server that proxies the Dataflow backend. (#17189)

[Robert Bradshaw] Improvements to dataflow job service for non-Python jobs.

[Robert Bradshaw] Get dataflow working.

[Robert Bradshaw] User friendly pipeline options.

[Robert Bradshaw] Less classes, more functions.

[Robert Bradshaw] Add new nullable standard coder.

[Robert Bradshaw] Make Apache Rat happy.

[Robert Bradshaw] Disable broken codecov.

[Robert Bradshaw] Remove last uses of base.ts.

[Robert Bradshaw] Remove unneeded file.

[Robert Bradshaw] Remove more unneeded/unused files.

[Robert Bradshaw] Cleanup tests.

[Robert Bradshaw] Minor cleanups to coder tests.

[noreply] Quote pip install package name

[noreply] [BEAM-14374] Fix module import error in FullyQualifiedNamedTransform

[Robert Bradshaw] Addressing issues from the review.

[noreply] Apply suggestions from code review.

[Robert Bradshaw] Post-merge fixes.

[dannymccormick] Delete tags.go

[Robert Bradshaw] Update tests to use our actual serialization libraries.

[Robert Bradshaw] Another pass at TODOs, removing finished items.

[Heejong Lee] [BEAM-14146] Python Streaming job failing to drain with BigQueryIO write

[Kenneth Knowles] Add parameter for service account impersonation in GCP credentials

[Heejong Lee] add test

[noreply] Merge pull request #17490 from [BEAM-14370] [Website] Add new page about

[noreply] [BEAM-14332] Refactored cluster management for Flink on Dataproc

[noreply] [BEAM-13988] Update mtime to use time.UnixMilli() calls (#17578)

[noreply] Fixing patching error on missing dependencies (#17564)

[noreply] Merge pull request #17517 from [BEAM-14383] Improve "FailedRows" errors

[Heejong Lee] add test without mock


------------------------------------------
[...truncated 49.75 KB...]
30e908a38a18: Preparing
cac2fff6ae3d: Preparing
08fa02ce37eb: Preparing
a037458de4e0: Preparing
bafdbe68e4ae: Preparing
a13c519c6361: Preparing
472be04c22c5: Waiting
62b86eaec16e: Waiting
eaf4977e4b8d: Waiting
0a41459588e0: Waiting
a037458de4e0: Waiting
30e908a38a18: Waiting
8cf0d33f2754: Waiting
bafdbe68e4ae: Waiting
808b623c3a5d: Waiting
cac2fff6ae3d: Waiting
a13c519c6361: Waiting
30a516cb300f: Waiting
20914767102e: Waiting
8e0917e7bf32: Waiting
08fa02ce37eb: Waiting
293397c337c4: Waiting
19fbe9594991: Waiting
d5ec5ce3f027: Pushed
60c29b9575bb: Pushed
50c7249ab501: Pushed
f5db9462bafd: Pushed
8a49b9f1b076: Pushed
472be04c22c5: Pushed
62b86eaec16e: Pushed
8cf0d33f2754: Pushed
8e0917e7bf32: Pushed
eaf4977e4b8d: Pushed
293397c337c4: Pushed
0a41459588e0: Layer already exists
30e908a38a18: Layer already exists
cac2fff6ae3d: Layer already exists
20914767102e: Pushed
08fa02ce37eb: Layer already exists
a037458de4e0: Layer already exists
bafdbe68e4ae: Layer already exists
a13c519c6361: Layer already exists
30a516cb300f: Pushed
19fbe9594991: Pushed
808b623c3a5d: Pushed
20220507124336: digest: sha256:0c861604a31b7296fff5e420538201f60d7b43fb16b894665d57d5ecec8ba232 size: 4935

> Task :sdks:java:testing:load-tests:run
May 07, 2022 12:45:28 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
May 07, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 222 files. Enable logging at DEBUG level to see which files will be staged.
May 07, 2022 12:45:29 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
May 07, 2022 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
May 07, 2022 12:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 222 files from PipelineOptions.filesToStage to staging location to prepare for execution.
May 07, 2022 12:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 222 files cached, 0 files newly uploaded in 0 seconds
May 07, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
May 07, 2022 12:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <120277 bytes, hash eedb4afc768e521e1986f4137319587062f48406c1037e7b4f9aa2b0bba67aba> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7ttK_HaOUh4ZhvQTcxlYcGL0hAbBA357T5qisLumero.pb
May 07, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
May 07, 2022 12:45:34 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@751ae8a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@235d659c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4232b34a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da16263, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f5ce0bb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47e51549, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@101a461c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@360e9c06, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ebffb44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@311ff287, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7377781e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31db34da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@109f8c7e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5edacf20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@16a5eb6d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e307087, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1220ef43, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a8b81e8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@234cff57, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e8507f1]
May 07, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
May 07, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
May 07, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
May 07, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
May 07, 2022 12:45:35 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3bf54172, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c9a6717, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b3cde6f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d091cad, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c663eaf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3bb5ceb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e692555, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ba0ae41, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76fe6cdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2ffb3aec, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@786ff1cb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46039a21, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@431e86b1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35c4e864, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32a2a6be, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@682af059, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f36c8e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4da39ca9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a9344f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5584d9c6]
May 07, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
May 07, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
May 07, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
May 07, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
May 07, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
May 07, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
May 07, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
May 07, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
May 07, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
May 07, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
May 07, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
May 07, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.40.0-SNAPSHOT
May 07, 2022 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-05-07_05_45_35-13121460534877176794?project=apache-beam-testing
May 07, 2022 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-05-07_05_45_35-13121460534877176794
May 07, 2022 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-05-07_05_45_35-13121460534877176794
May 07, 2022 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-05-07T12:45:39.972Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-05-qa4m. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
May 07, 2022 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:46.703Z: Worker configuration: e2-standard-2 in us-central1-b.
May 07, 2022 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:47.501Z: Expanding SplittableParDo operations into optimizable parts.
May 07, 2022 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:47.532Z: Expanding CollectionToSingleton operations into optimizable parts.
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:47.589Z: Expanding CoGroupByKey operations into optimizable parts.
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:47.648Z: Expanding SplittableProcessKeyed operations into optimizable parts.
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:47.681Z: Expanding GroupByKey operations into streaming Read/Write steps
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:47.740Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:47.825Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:47.855Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:47.880Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:47.904Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:47.924Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:47.948Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:47.977Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:48.000Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:48.034Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:48.067Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:48.101Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:48.135Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:48.166Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:48.194Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:48.260Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:48.295Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:48.328Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:48.362Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:48.420Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:48.442Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:48.474Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:48.510Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:48.544Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:48.716Z: Running job using Streaming Engine
May 07, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:45:48.927Z: Starting 5 workers in us-central1-b...
May 07, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:46:06.670Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
May 07, 2022 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:46:12.063Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
May 07, 2022 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T12:47:14.221Z: Workers have started successfully.
May 07, 2022 4:01:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T16:01:01.355Z: Cancel request is committed for workflow job: 2022-05-07_05_45_35-13121460534877176794.
May 07, 2022 4:01:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T16:01:06.708Z: Cleaning up.
May 07, 2022 4:01:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T16:01:06.847Z: Stopping worker pool...
May 07, 2022 4:01:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T16:01:06.894Z: Stopping worker pool...
May 07, 2022 4:01:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T16:01:47.299Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
May 07, 2022 4:01:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-07T16:01:47.341Z: Worker pool stopped.
May 07, 2022 4:01:59 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-05-07_05_45_35-13121460534877176794 finished with status CANCELLED.
Load test results for test (ID): 5db9bc63-8211-4fe4-b54b-4672791cdbd0 and timestamp: 2022-05-07T12:45:29.019000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11601.862
dataflow_v2_java11_total_bytes_count             2.58296131E10
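For reference, the two metrics above together imply an average throughput of roughly 2.2 MB/s over the run. A minimal Java sketch of that arithmetic (class and method names are invented for illustration; the numeric constants are copied from the table above):

```java
// Hypothetical helper, not part of Beam: derives average throughput
// from the runtime and byte-count metrics reported by the load test.
public class ThroughputFromMetrics {
    // Values copied from the load-test output above.
    static final double RUNTIME_SEC = 11601.862;
    static final double TOTAL_BYTES = 2.58296131E10;

    static double bytesPerSecond(double totalBytes, double runtimeSec) {
        return totalBytes / runtimeSec;
    }

    public static void main(String[] args) {
        // About 2.2 MB/s for this run.
        System.out.printf("%.1f MB/s%n",
            bytesPerSecond(TOTAL_BYTES, RUNTIME_SEC) / 1e6);
    }
}
```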
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
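The stack trace above shows the load-test harness turning the terminal CANCELLED state into a RuntimeException, which is what fails the Gradle task: streaming load tests are cancelled on a timer, and any terminal state other than DONE is treated as a failure. A minimal, hypothetical sketch of that check (not the actual Beam source; the class and enum names here are invented):

```java
import java.util.EnumSet;
import java.util.Set;

// Hypothetical sketch of a load-test failure handler: any terminal
// state other than DONE is surfaced as a RuntimeException, matching
// the "Invalid job state: CANCELLED." message seen in the log.
public class JobFailureSketch {
    enum State { RUNNING, DONE, FAILED, CANCELLED, UPDATED }

    static final Set<State> INVALID_TERMINAL =
        EnumSet.of(State.FAILED, State.CANCELLED, State.UPDATED);

    static void handleFailure(State terminalState) {
        if (INVALID_TERMINAL.contains(terminalState)) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        handleFailure(State.DONE); // a successful run passes silently
        try {
            handleFailure(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```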

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220507124336
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0c861604a31b7296fff5e420538201f60d7b43fb16b894665d57d5ecec8ba232
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220507124336]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0c861604a31b7296fff5e420538201f60d7b43fb16b894665d57d5ecec8ba232]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220507124336] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0c861604a31b7296fff5e420538201f60d7b43fb16b894665d57d5ecec8ba232])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0c861604a31b7296fff5e420538201f60d7b43fb16b894665d57d5ecec8ba232
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0c861604a31b7296fff5e420538201f60d7b43fb16b894665d57d5ecec8ba232
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0c861604a31b7296fff5e420538201f60d7b43fb16b894665d57d5ecec8ba232].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 43s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/6c3jes2ccp54u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #321

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/321/display/redirect?page=changes>

Changes:

[zyichi] Move master readme.md to 2.40.0

[noreply] [BEAM-14173] Fix Go Loadtests on Dataflow & partial fix for Flink

[noreply] Upgrade python sdk container requirements. (#17549)

[noreply] Merge pull request #17497: [BEAM-11205] Update GCP Libraries BOM version

[noreply] [BEAM-12603] Add retry on grpc data channel and remove retry from test.

[noreply] Merge pull request #17359: [BEAM-14303] Add a way to exclude output

[noreply] [BEAM-14347] Allow users to optimize DoFn execution with a single

[noreply] [BEAM-5878] Add (failing) kwonly-argument test (#17509)


------------------------------------------
[...truncated 59.77 KB...]
08fa02ce37eb: Layer already exists
a037458de4e0: Layer already exists
bafdbe68e4ae: Layer already exists
a13c519c6361: Layer already exists
160285df056b: Pushed
bbdb0e929abe: Pushed
537b2f1e73b8: Pushed
20220506124348: digest: sha256:ba2fd6e85f190d09462a32fd5b30794fd9cde4b8797fee19edb9f341e25f8814 size: 4935

> Task :sdks:java:testing:load-tests:run
May 06, 2022 12:46:50 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
May 06, 2022 12:46:51 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 222 files. Enable logging at DEBUG level to see which files will be staged.
May 06, 2022 12:46:52 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
May 06, 2022 12:46:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
May 06, 2022 12:46:54 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 222 files from PipelineOptions.filesToStage to staging location to prepare for execution.
May 06, 2022 12:46:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 222 files cached, 0 files newly uploaded in 1 seconds
May 06, 2022 12:46:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
May 06, 2022 12:46:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <120277 bytes, hash 5f42cb289df306e80c4cda4705da4cb0d8fc969b22935e6aeca71d9e0cd9e2cf> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-X0LLKJ3zBugMTNpHBdpMsNj8lpsik15q7KcdngzZ4s8.pb
May 06, 2022 12:46:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
May 06, 2022 12:46:58 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47e51549, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@101a461c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@360e9c06, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ebffb44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@311ff287, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7377781e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31db34da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@109f8c7e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5edacf20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@16a5eb6d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e307087, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1220ef43, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a8b81e8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@234cff57, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e8507f1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4bcaa195, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d08edc5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49fa1d74, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f362135, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@21eee94f]
May 06, 2022 12:46:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
May 06, 2022 12:46:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
May 06, 2022 12:46:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
May 06, 2022 12:46:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
May 06, 2022 12:46:58 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c663eaf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3bb5ceb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e692555, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ba0ae41, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76fe6cdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2ffb3aec, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@786ff1cb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46039a21, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@431e86b1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35c4e864, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32a2a6be, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@682af059, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f36c8e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4da39ca9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a9344f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5584d9c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c9c6245, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d0be7ab, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d4fb213, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ef60295]
May 06, 2022 12:46:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
May 06, 2022 12:46:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
May 06, 2022 12:46:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
May 06, 2022 12:46:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
May 06, 2022 12:46:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
May 06, 2022 12:46:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
May 06, 2022 12:46:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
May 06, 2022 12:46:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
May 06, 2022 12:46:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
May 06, 2022 12:46:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
May 06, 2022 12:46:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
May 06, 2022 12:46:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.40.0-SNAPSHOT
May 06, 2022 12:46:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-05-06_05_46_58-15448386458134042650?project=apache-beam-testing
May 06, 2022 12:46:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-05-06_05_46_58-15448386458134042650
May 06, 2022 12:46:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-05-06_05_46_58-15448386458134042650
May 06, 2022 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-05-06T12:47:40.853Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-05-9zyj. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:47.585Z: Worker configuration: e2-standard-2 in us-central1-b.
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:48.623Z: Expanding SplittableParDo operations into optimizable parts.
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:48.790Z: Expanding CollectionToSingleton operations into optimizable parts.
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:48.916Z: Expanding CoGroupByKey operations into optimizable parts.
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.003Z: Expanding SplittableProcessKeyed operations into optimizable parts.
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.034Z: Expanding GroupByKey operations into streaming Read/Write steps
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.106Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.199Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.234Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.278Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.310Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.345Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.380Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.406Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.429Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.475Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.495Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.531Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.566Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.599Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.631Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.666Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.699Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.732Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.757Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.791Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.825Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.859Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.903Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:49.931Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:50.061Z: Running job using Streaming Engine
May 06, 2022 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:47:50.278Z: Starting 5 workers in us-central1-b...
May 06, 2022 12:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:48:12.794Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
May 06, 2022 12:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:48:13.704Z: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
May 06, 2022 12:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:48:13.728Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
May 06, 2022 12:48:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:48:23.938Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
May 06, 2022 12:49:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T12:49:22.237Z: Workers have started successfully.
May 06, 2022 4:00:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T16:00:49.680Z: Cancel request is committed for workflow job: 2022-05-06_05_46_58-15448386458134042650.
May 06, 2022 4:00:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T16:00:49.736Z: Cleaning up.
May 06, 2022 4:00:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T16:00:49.847Z: Stopping worker pool...
May 06, 2022 4:00:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T16:00:49.901Z: Stopping worker pool...
May 06, 2022 4:01:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T16:01:22.621Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
May 06, 2022 4:01:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-06T16:01:22.675Z: Worker pool stopped.
May 06, 2022 4:01:29 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-05-06_05_46_58-15448386458134042650 finished with status CANCELLED.
Load test results for test (ID): d0867457-ca6a-4f9d-a8c1-86f2a8ade536 and timestamp: 2022-05-06T12:46:51.915000000Z:
Metric:                                      Value:
dataflow_v2_java11_runtime_sec               11460.032
dataflow_v2_java11_total_bytes_count         3.65416332E10
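From the two reported metrics the run's average throughput can be derived as total bytes divided by runtime. A minimal sketch using the values from the table above; the `Throughput` class and method names are purely illustrative, not part of the load-test harness:

```java
public class Throughput {
    // Average throughput over the whole run, in bytes per second.
    public static double bytesPerSecond(double totalBytes, double runtimeSec) {
        return totalBytes / runtimeSec;
    }

    public static void main(String[] args) {
        // Values reported by this load-test run.
        double runtimeSec = 11460.032;            // dataflow_v2_java11_runtime_sec
        double totalBytes = 3.65416332E10;        // dataflow_v2_java11_total_bytes_count
        System.out.printf("Average throughput: %.0f bytes/s%n",
                bytesPerSecond(totalBytes, runtimeSec));
    }
}
```

This works out to roughly 3.2 MB/s sustained over the ~3.2-hour streaming run.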
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
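The exception above is the load-test harness treating any terminal job state other than DONE (here, the deliberate CANCELLED after the test window) as a build failure. A minimal sketch of that check, with a hypothetical `State` enum standing in for the runner's job-state type; this is not the actual JobFailure implementation:

```java
public class JobFailureSketch {
    enum State { DONE, FAILED, CANCELLED, UPDATED }

    // Mirrors the behavior seen above: a non-DONE terminal state is
    // surfaced as a RuntimeException, which fails the Gradle task.
    static void handleFailure(State terminalState) {
        if (terminalState != State.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

Under this scheme a job that is cancelled on schedule still marks the Jenkins build FAILURE, which matches the outcome of this run.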

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220506124348
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ba2fd6e85f190d09462a32fd5b30794fd9cde4b8797fee19edb9f341e25f8814
Deleted: sha256:efd7aa4a8979becbf291eff6ef964cee8383d6d74a725bb7fb9b124dcb12a80c
Deleted: sha256:f1f496d9047af3aa9054c8a465fb59fe21121a033055e3eeb1015f30d46fcdb6
Deleted: sha256:e2034096e422626f0c9523b1119ed0fa13e2e79182dc5dbf6d64cda67b242159
Deleted: sha256:cf2276d761c8fa73ba3f36e1fdd10629142d411eeca691b5eeff5563a7852aac
Deleted: sha256:9f2984463d6a20a033d97d1f28a049652d297d83588737107a5ce8a9f1a813e1
Deleted: sha256:26abe58f828ef30395f4a05c68733e283489a84601b5f359579d54db3c23ad72
Deleted: sha256:de5f6d9cea8882883489ab872098616f0fb38652668993c65b9ad90a9e2ddf97
Deleted: sha256:9e57cb002736af1919f0373dc209b6da14b8e9d73b80c11c624312d0ce9bad69
Deleted: sha256:e1a446f1c4b38c1abcadfa962ed1818d24e948ca1703daadfea0b45554e8b25b
Deleted: sha256:984ac649f3ecd659514d35f58bd2cc1dd782e6a917b88c0050239af6bdc136b3
Deleted: sha256:44b5a94f46223baf5cf37fc90a817b86312a964d4128a45fdeb8c65da27cd2c5
Deleted: sha256:b0c9e3f7625ced85e62f59ba554839f04f29ea949a11a091593d5f2f1cc8cc96
Deleted: sha256:fbc421738dec4bd6a8ca6ed2dc64ae1e02ccf329563c82dc98a79c8321d1b0ae
Deleted: sha256:02314c0594d4415b3d2dde1472a649569bad3e13324d99a1a7ac6b6656b78516
Deleted: sha256:6e4b5f876168ec05825279d1086a65e82dcfcbf86ed4f6d197902fa14a689c61
Deleted: sha256:72351c7a7644a8af3854b79a09e2ae5f7980b685442bf7fadb6b611cb5c8eda1
Deleted: sha256:a77101578a52300d3acdcaf4a30bbab6b7f17aac8d14d91ddbae03acc288878f
Deleted: sha256:9813745959d5370e1acd8b763b7129e02d634c69c4b072b9baae9c470f7d2d7d
Deleted: sha256:9269915bba8e5d5335b44fd34f05e5ace269196a98456dd6f8446371bae5c702
Deleted: sha256:66b316bf81222196d1242ad220c01a1e4240901fa322c11ece9e04236b3a815f
Deleted: sha256:c4c06addb9aa73d496773ed77485c8f8840d1d90c6877878ea8cec4061239709
Deleted: sha256:3b7c9fda683172705762b77ce66c91881dd926111168db4cd6a048e6f0f3cd67
Deleted: sha256:a04d0ffe5f60b9fffdce971e9c6c6686b1bca77e00a50b62f338805b2fa9de23
Deleted: sha256:9a09b2a17f4e493550770e95bb1d12ec92f87882a331b65e46e487f6495797c5
Deleted: sha256:1b89c28538b6adbcad758ae37b8a7fcd6d250a2175c8ecbb0a67285398da9157
Deleted: sha256:cbe11ca3036dce679b149d05ce112613e38fc651b68d4e07a89b25e0e00d17c4
Deleted: sha256:2cff29e340db47c4d1cfdd18da81607c7346f60c369b83ee5e7e8fa6a8fc75fc
Deleted: sha256:211b405413aaf257cd64ed42ffb3cb3ec3a8d6028f524bc12226d26701cd1029
Deleted: sha256:a54f493a7db6785a39a53ab84dd2d2a1703278c23d38ffabc0061586fffb2137
Deleted: sha256:e8d1a46501e7b5df6a80b0f80cd277e4406e65dd0369c07323f723fea8c3fbba
Deleted: sha256:dd56aca3632e26390db940b8fb26b536da8d093c449b87845182df2e0e86a3d0
Deleted: sha256:ed39802e7f1411fe8af4249f213ffb4a0884a50bcc04edeeec403f670b396f8a
Deleted: sha256:39b721d0ec1131d7587e460be6986dd132923e4a8088096cce9f69fa5087b488
Deleted: sha256:7ab7f5535c47e8db95f971b2e4baf1e41d0147c13b464d964430936ce69c7177
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220506124348]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ba2fd6e85f190d09462a32fd5b30794fd9cde4b8797fee19edb9f341e25f8814]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220506124348] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ba2fd6e85f190d09462a32fd5b30794fd9cde4b8797fee19edb9f341e25f8814])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ba2fd6e85f190d09462a32fd5b30794fd9cde4b8797fee19edb9f341e25f8814
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ba2fd6e85f190d09462a32fd5b30794fd9cde4b8797fee19edb9f341e25f8814
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ba2fd6e85f190d09462a32fd5b30794fd9cde4b8797fee19edb9f341e25f8814].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 10s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/cbz3il6j33zw6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #320

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/320/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-9245] Unable to pull datatore Entity which contains dict

[bulat.safiullin] [BEAM-14382] [Website] add banner container for with css, images, html

[Jan Lukavský] [BEAM-14196] add test verifying output watermark propagation in bundle

[Jan Lukavský] [BEAM-14196] Fix FlinkRunner mid-bundle output watermark handling

[nielm] [BEAM-14405] Fix NPE when ProjectID is not specified in a template

[bulat.safiullin] [BEAM-14382] change mobile banner img, add padding to banner section

[ahmedabualsaud] fix test decorator typo

[noreply] Merge pull request #17440 from [BEAM-14329] Enable exponential backoff

[noreply] [BEAM-11104] Fix output forwarding issue for ProcessContinuations

[noreply] re-add testing package to pydoc (#17524)

[Heejong Lee] add test

[noreply] [BEAM-14250] Amended the workaround (#17531)

[noreply] [BEAM-11104] Fix broken split result validation (#17546)

[noreply] Fixed a SQL and screenshots in the Beam SQL blog (#17545)

[noreply] Merge pull request #17417: [BEAM-14388] Address some performance

[noreply] [BEAM-14386] [Flink] Support for scala 2.12 (#17512)

[noreply] [BEAM-14294] Worker changes to support trivial Batched DoFns (#17384)

[zyichi] Moving to 2.40.0-SNAPSHOT on master branch.

[noreply] [BEAM-14048] [CdapIO] Add ConfigWrapper for building CDAP PluginConfigs


------------------------------------------
[...truncated 50.63 KB...]
0a41459588e0: Preparing
30e908a38a18: Preparing
cac2fff6ae3d: Preparing
08fa02ce37eb: Preparing
a037458de4e0: Preparing
bafdbe68e4ae: Preparing
a13c519c6361: Preparing
75bb19b3d7ee: Waiting
f42324cfd4a5: Waiting
d2c4886e7c6c: Waiting
147da672cee9: Waiting
37a6d0a26f1a: Waiting
63263f35234e: Waiting
07c6267056f9: Waiting
659b5fea55d9: Waiting
30e908a38a18: Waiting
08fa02ce37eb: Waiting
0a41459588e0: Waiting
615950b7fc1a: Waiting
cac2fff6ae3d: Waiting
83f416a905e1: Waiting
bafdbe68e4ae: Waiting
a13c519c6361: Waiting
4d27ab4a3cc1: Pushed
f9b097362259: Pushed
0bf46d22197f: Pushed
ec76253c4a13: Pushed
cb8169ae9da4: Pushed
147da672cee9: Pushed
659b5fea55d9: Pushed
75bb19b3d7ee: Pushed
f42324cfd4a5: Pushed
83f416a905e1: Pushed
615950b7fc1a: Pushed
0a41459588e0: Layer already exists
30e908a38a18: Layer already exists
d2c4886e7c6c: Pushed
cac2fff6ae3d: Layer already exists
08fa02ce37eb: Layer already exists
a037458de4e0: Layer already exists
a13c519c6361: Layer already exists
bafdbe68e4ae: Layer already exists
07c6267056f9: Pushed
63263f35234e: Pushed
37a6d0a26f1a: Pushed
20220505124330: digest: sha256:d4e6504699ae8d3d63fa7d77206b7f7a8342e11ecb4c985ac8cfb0f43a6acde5 size: 4935

> Task :sdks:java:testing:load-tests:run
May 05, 2022 12:45:25 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
May 05, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 221 files. Enable logging at DEBUG level to see which files will be staged.
May 05, 2022 12:45:26 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
May 05, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
May 05, 2022 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 221 files from PipelineOptions.filesToStage to staging location to prepare for execution.
May 05, 2022 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 221 files cached, 0 files newly uploaded in 0 seconds
May 05, 2022 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
May 05, 2022 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <119943 bytes, hash 73ac52ce91709873524cf81c3aeaf9f00de5c3fb9cdfee5279b6e7bc90797e4d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-c6xSzpFwmHNSTPgcOur58A3lw_uc3-5SebbnvJB5fk0.pb
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
May 05, 2022 12:45:32 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@659feb22, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3468ee6e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4b98f6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@421def93, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58c1da09, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b2954e1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58d6e55a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@751ae8a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@235d659c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4232b34a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da16263, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f5ce0bb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47e51549, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@101a461c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@360e9c06, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ebffb44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@311ff287, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7377781e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31db34da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@109f8c7e]
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
May 05, 2022 12:45:32 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3bf54172, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c9a6717, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b3cde6f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d091cad, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c663eaf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3bb5ceb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e692555, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ba0ae41, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76fe6cdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2ffb3aec, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@786ff1cb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46039a21, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@431e86b1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35c4e864, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32a2a6be, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@682af059, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f36c8e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4da39ca9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a9344f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5584d9c6]
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.40.0-SNAPSHOT
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-05-05_05_45_32-315263199411405593?project=apache-beam-testing
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-05-05_05_45_32-315263199411405593
May 05, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-05-05_05_45_32-315263199411405593
May 05, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-05-05T12:45:37.866Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-05-l673. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
May 05, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:43.936Z: Worker configuration: e2-standard-2 in us-central1-b.
May 05, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:44.713Z: Expanding SplittableParDo operations into optimizable parts.
May 05, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:44.751Z: Expanding CollectionToSingleton operations into optimizable parts.
May 05, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:44.849Z: Expanding CoGroupByKey operations into optimizable parts.
May 05, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:44.936Z: Expanding SplittableProcessKeyed operations into optimizable parts.
May 05, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:44.967Z: Expanding GroupByKey operations into streaming Read/Write steps
May 05, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:45.043Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
May 05, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:45.393Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
May 05, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:45.481Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
May 05, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:45.586Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
May 05, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:45.661Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
May 05, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:45.730Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
May 05, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:45.788Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
May 05, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:45.829Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
May 05, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:45.863Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
May 05, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:45.918Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
May 05, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:45.958Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
May 05, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:46.002Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
May 05, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:46.045Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
May 05, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:46.111Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
May 05, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:46.144Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
May 05, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:46.237Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
May 05, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:46.269Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
May 05, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:46.301Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
May 05, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:46.332Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
May 05, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:46.364Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
May 05, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:46.413Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
May 05, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:46.449Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
May 05, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:46.483Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
May 05, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:46.516Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
May 05, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:46.703Z: Running job using Streaming Engine
May 05, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:47.029Z: Starting 5 workers in us-central1-b...
May 05, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:45:54.157Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
May 05, 2022 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:46:09.837Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
May 05, 2022 12:47:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T12:47:17.612Z: Workers have started successfully.
May 05, 2022 4:01:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T16:00:59.913Z: Cancel request is committed for workflow job: 2022-05-05_05_45_32-315263199411405593.
May 05, 2022 4:01:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T16:01:05.996Z: Cleaning up.
May 05, 2022 4:01:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T16:01:06.078Z: Stopping worker pool...
May 05, 2022 4:01:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T16:01:06.166Z: Stopping worker pool...
May 05, 2022 4:01:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T16:01:40.684Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
May 05, 2022 4:01:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-05T16:01:41.375Z: Worker pool stopped.
May 05, 2022 4:01:51 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-05-05_05_45_32-315263199411405593 finished with status CANCELLED.
Load test results for test (ID): 9b0248a7-2e8b-44d0-afab-f9aca102ef82 and timestamp: 2022-05-05T12:45:26.264000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11596.013
dataflow_v2_java11_total_bytes_count             2.14012333E10
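For reference, the two metrics above can be combined into an average throughput figure. A quick sketch, with the values copied from this run and an illustrative class/method name (not part of the Beam load-test harness):

```java
public class ThroughputSketch {
    // Values reported by the load test above.
    static final double RUNTIME_SEC = 11596.013;
    static final double TOTAL_BYTES = 2.14012333e10;

    // Average bytes processed per second over the whole run:
    // roughly 1.85 MB/s sustained across the ~3.2-hour window.
    static double bytesPerSec() {
        return TOTAL_BYTES / RUNTIME_SEC;
    }

    public static void main(String[] args) {
        System.out.printf("%.0f bytes/sec%n", bytesPerSec());
    }
}
```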
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

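The RuntimeException above is raised deliberately: the load-test harness treats any terminal job state other than DONE as a failed run, so the cancellation at 16:00 (apparently a scheduled time limit) still fails the build. A minimal sketch of that check, assuming enum values that mirror Dataflow's terminal states; the class and method names are illustrative, not Beam's actual JobFailure code:

```java
public class JobStateCheck {
    enum State { DONE, FAILED, CANCELLED, UPDATED, UNKNOWN }

    // Mirrors the behavior seen in the log: only DONE counts as success,
    // so a job cancelled by a timeout still fails the build.
    static void handleTerminalState(State state) {
        if (state != State.DONE) {
            throw new RuntimeException("Invalid job state: " + state + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleTerminalState(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```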
> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220505124330
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d4e6504699ae8d3d63fa7d77206b7f7a8342e11ecb4c985ac8cfb0f43a6acde5
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220505124330]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d4e6504699ae8d3d63fa7d77206b7f7a8342e11ecb4c985ac8cfb0f43a6acde5]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220505124330] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d4e6504699ae8d3d63fa7d77206b7f7a8342e11ecb4c985ac8cfb0f43a6acde5])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d4e6504699ae8d3d63fa7d77206b7f7a8342e11ecb4c985ac8cfb0f43a6acde5
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d4e6504699ae8d3d63fa7d77206b7f7a8342e11ecb4c985ac8cfb0f43a6acde5
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d4e6504699ae8d3d63fa7d77206b7f7a8342e11ecb4c985ac8cfb0f43a6acde5].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 31s
109 actionable tasks: 73 executed, 32 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/2gvpgftvcjjfy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #319

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/319/display/redirect?page=changes>

Changes:

[noreply] fix: JDBC config schema fields order

[Brian Hulette] Revert "Merge pull request #17255 from kileys/test-revert"

[Brian Hulette] BEAM-14231: bypass schema cache for

[noreply] [BEAM-13657] Follow up update version warning in __init__ (#17493)

[noreply] Merge pull request #17431 from [BEAM-14273] Add integration tests for BQ

[noreply] Merge pull request #17205 from [BEAM-14145] [Website] add carousel to

[noreply] [BEAM-14064] fix es io windowing (#17112)

[noreply] [BEAM-13670] Upgraded ipython from v7 to v8 (#17529)

[noreply] [BEAM-11104] Enable ProcessContinuation return values, add unit test

[Robert Bradshaw] [BEAM-14403] Allow Prime to be used with legacy workers.

[noreply] [BEAM-11106] Support drain in Go SDK (#17432)

[noreply] add __Init__ to inference. (#17514)


------------------------------------------
[...truncated 50.46 KB...]
30e908a38a18: Preparing
cac2fff6ae3d: Preparing
08fa02ce37eb: Preparing
a037458de4e0: Preparing
bafdbe68e4ae: Preparing
a13c519c6361: Preparing
e5738d34aa24: Waiting
d9d480801519: Waiting
0a41459588e0: Waiting
c962695054f6: Waiting
30e908a38a18: Waiting
d7872b2348cd: Waiting
4b5e625f91be: Waiting
bafdbe68e4ae: Waiting
1441018512ca: Waiting
a13c519c6361: Waiting
5e8f247e9dd6: Waiting
4b1836b1437e: Waiting
cac2fff6ae3d: Waiting
08fa02ce37eb: Waiting
a037458de4e0: Waiting
5308e6740826: Waiting
dc778a8d307e: Waiting
6c55f04777f7: Pushed
1b5b6fa79b96: Pushed
1829b8929775: Pushed
24b39202fa9b: Pushed
2e5c453e4083: Pushed
1441018512ca: Pushed
d7872b2348cd: Pushed
dc778a8d307e: Pushed
4b1836b1437e: Pushed
4b5e625f91be: Pushed
5e8f247e9dd6: Pushed
0a41459588e0: Layer already exists
30e908a38a18: Layer already exists
cac2fff6ae3d: Layer already exists
08fa02ce37eb: Layer already exists
a037458de4e0: Layer already exists
bafdbe68e4ae: Layer already exists
a13c519c6361: Layer already exists
c962695054f6: Pushed
d9d480801519: Pushed
e5738d34aa24: Pushed
5308e6740826: Pushed
20220504124329: digest: sha256:d8a3b9c56b1044e54ce1ae0aac184a3e7baae699118ebf3eeae603951f8fbd58 size: 4935

> Task :sdks:java:testing:load-tests:run
May 04, 2022 12:45:27 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
May 04, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 221 files. Enable logging at DEBUG level to see which files will be staged.
May 04, 2022 12:45:29 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
May 04, 2022 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
May 04, 2022 12:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 221 files from PipelineOptions.filesToStage to staging location to prepare for execution.
May 04, 2022 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 221 files cached, 0 files newly uploaded in 1 seconds
May 04, 2022 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
May 04, 2022 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <119943 bytes, hash 14d9b17a1a2a38efd504bda6ff62ed7635e41b9b8bd7299361eda1b8d66bed57> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-FNmxehoqOO_VBL2m_2LtdjXkG5uL1ymTYe2huNZr7Vc.pb
May 04, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
May 04, 2022 12:45:35 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@659feb22, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3468ee6e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4b98f6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@421def93, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58c1da09, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b2954e1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58d6e55a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@751ae8a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@235d659c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4232b34a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da16263, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f5ce0bb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47e51549, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@101a461c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@360e9c06, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ebffb44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@311ff287, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7377781e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31db34da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@109f8c7e]
May 04, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
May 04, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
May 04, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
May 04, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
May 04, 2022 12:45:35 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3bf54172, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c9a6717, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b3cde6f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d091cad, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c663eaf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3bb5ceb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e692555, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ba0ae41, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76fe6cdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2ffb3aec, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@786ff1cb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46039a21, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@431e86b1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35c4e864, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32a2a6be, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@682af059, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f36c8e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4da39ca9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a9344f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5584d9c6]
May 04, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
May 04, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
May 04, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
May 04, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
May 04, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
May 04, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
May 04, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
May 04, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
May 04, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
May 04, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
May 04, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
May 04, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.39.0-SNAPSHOT
May 04, 2022 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-05-04_05_45_35-9320762269807125234?project=apache-beam-testing
May 04, 2022 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-05-04_05_45_35-9320762269807125234
May 04, 2022 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-05-04_05_45_35-9320762269807125234
May 04, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-05-04T12:45:40.475Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-05-58ak. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
May 04, 2022 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:46.925Z: Worker configuration: e2-standard-2 in us-central1-b.
May 04, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:47.725Z: Expanding SplittableParDo operations into optimizable parts.
May 04, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:47.850Z: Expanding CollectionToSingleton operations into optimizable parts.
May 04, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:48.117Z: Expanding CoGroupByKey operations into optimizable parts.
May 04, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:48.373Z: Expanding SplittableProcessKeyed operations into optimizable parts.
May 04, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:48.535Z: Expanding GroupByKey operations into streaming Read/Write steps
May 04, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:48.731Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:48.943Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:48.987Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:49.031Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:49.058Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:49.089Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:49.125Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:49.150Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:49.188Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:49.231Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:49.267Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:49.292Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:49.332Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:49.378Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:49.484Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:49.691Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:49.799Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:49.894Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:49.964Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:50.071Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:50.154Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:50.217Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
May 04, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:50.283Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
May 04, 2022 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:50.314Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
May 04, 2022 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:50.507Z: Running job using Streaming Engine
May 04, 2022 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:45:50.801Z: Starting 5 workers in us-central1-b...
May 04, 2022 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:46:05.346Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
May 04, 2022 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:46:13.661Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
May 04, 2022 12:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T12:47:24.901Z: Workers have started successfully.
May 04, 2022 4:01:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T16:00:59.914Z: Cancel request is committed for workflow job: 2022-05-04_05_45_35-9320762269807125234.
May 04, 2022 4:01:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T16:01:06.215Z: Cleaning up.
May 04, 2022 4:01:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T16:01:06.351Z: Stopping worker pool...
May 04, 2022 4:01:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T16:01:06.404Z: Stopping worker pool...
May 04, 2022 4:01:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T16:01:42.406Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
May 04, 2022 4:01:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-04T16:01:42.460Z: Worker pool stopped.
May 04, 2022 4:01:49 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-05-04_05_45_35-9320762269807125234 finished with status CANCELLED.
Load test results for test (ID): 443d8319-d093-4229-9f73-e483cb3a0d2c and timestamp: 2022-05-04T12:45:28.640000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11585.746
dataflow_v2_java11_total_bytes_count             1.45309165E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220504124329
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d8a3b9c56b1044e54ce1ae0aac184a3e7baae699118ebf3eeae603951f8fbd58
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220504124329]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d8a3b9c56b1044e54ce1ae0aac184a3e7baae699118ebf3eeae603951f8fbd58]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220504124329] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d8a3b9c56b1044e54ce1ae0aac184a3e7baae699118ebf3eeae603951f8fbd58])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d8a3b9c56b1044e54ce1ae0aac184a3e7baae699118ebf3eeae603951f8fbd58
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d8a3b9c56b1044e54ce1ae0aac184a3e7baae699118ebf3eeae603951f8fbd58
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d8a3b9c56b1044e54ce1ae0aac184a3e7baae699118ebf3eeae603951f8fbd58].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 31s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/byic2zf5526ik

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #318

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/318/display/redirect?page=changes>

Changes:

[yathu] [BEAM-14375] Fix Java Wordcount Dataflow postcommit

[Robert Bradshaw] Allow arithmetic between deferred scalars.

[noreply] [BEAM-14390] Set user-agent when pulling licenses to avoid 403s (#17521)

[noreply] [BEAM-8688] Upgrade GCSIO to 2.2.6 (#17486)

[noreply] [BEAM-14253] patch SubscriptionPartitionLoader to work around a dataflow

[noreply] Add website link log to notify user of pre-build workflow. (#17498)

[noreply] [BEAM-11105] Add timestamp observing watermark estimation (#17476)

[noreply] Merge pull request #17487 from Adding user-agent to GCS client in Python

[noreply] [BEAM-10265] Display error message if trying to infer recursive schema

[noreply] [BEAM-12575] Upgraded ipykernel from v5 to v6 (#17526)

[noreply] [BEAM-11105] Add docs + CHANGES.md entry for Go Watermark Estimation

[noreply] Merge pull request #17380 from [BEAM-14314][BEAM-9532] Add last_updated


------------------------------------------
[...truncated 51.83 KB...]
0a41459588e0: Layer already exists
30e908a38a18: Layer already exists
ea12b6728cfd: Pushed
cac2fff6ae3d: Layer already exists
08fa02ce37eb: Layer already exists
a037458de4e0: Layer already exists
bafdbe68e4ae: Layer already exists
a13c519c6361: Layer already exists
7ab7ddbffdd9: Pushed
0ad4f886cb85: Pushed
e3817dfcabad: Pushed
20220503124335: digest: sha256:c8fd29f18ae9cd4c35079dadd0546583a666dd3bc1827f0935ec365990e097cc size: 4935

> Task :sdks:java:testing:load-tests:run
May 03, 2022 12:45:38 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
May 03, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 221 files. Enable logging at DEBUG level to see which files will be staged.
May 03, 2022 12:45:40 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
May 03, 2022 12:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
May 03, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 221 files from PipelineOptions.filesToStage to staging location to prepare for execution.
May 03, 2022 12:45:47 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 221 files cached, 0 files newly uploaded in 0 seconds
May 03, 2022 12:45:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
May 03, 2022 12:45:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <119943 bytes, hash 916c5088b3bd03a87ddb8851e5d10a9c1e326c5a3f2545d12c240ea3e199467c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-kWxQiLO9A6h924hR5dEKnB4ybFo_JUXRLCQOo-GZRnw.pb
May 03, 2022 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
May 03, 2022 12:45:49 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@659feb22, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3468ee6e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4b98f6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@421def93, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58c1da09, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b2954e1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58d6e55a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@751ae8a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@235d659c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4232b34a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da16263, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f5ce0bb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47e51549, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@101a461c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@360e9c06, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ebffb44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@311ff287, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7377781e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31db34da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@109f8c7e]
May 03, 2022 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
May 03, 2022 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
May 03, 2022 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
May 03, 2022 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
May 03, 2022 12:45:49 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3bf54172, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c9a6717, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b3cde6f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d091cad, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c663eaf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3bb5ceb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e692555, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ba0ae41, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76fe6cdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2ffb3aec, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@786ff1cb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46039a21, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@431e86b1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35c4e864, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32a2a6be, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@682af059, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f36c8e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4da39ca9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a9344f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5584d9c6]
May 03, 2022 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
May 03, 2022 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
May 03, 2022 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
May 03, 2022 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
May 03, 2022 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
May 03, 2022 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
May 03, 2022 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
May 03, 2022 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
May 03, 2022 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
May 03, 2022 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
May 03, 2022 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
May 03, 2022 12:45:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.39.0-SNAPSHOT
May 03, 2022 12:45:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-05-03_05_45_49-12202947964391131369?project=apache-beam-testing
May 03, 2022 12:45:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-05-03_05_45_49-12202947964391131369
May 03, 2022 12:45:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-05-03_05_45_49-12202947964391131369
May 03, 2022 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-05-03T12:45:56.412Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-05-tiz5. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
May 03, 2022 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:01.823Z: Worker configuration: e2-standard-2 in us-central1-b.
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:02.649Z: Expanding SplittableParDo operations into optimizable parts.
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:02.677Z: Expanding CollectionToSingleton operations into optimizable parts.
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:02.736Z: Expanding CoGroupByKey operations into optimizable parts.
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:02.813Z: Expanding SplittableProcessKeyed operations into optimizable parts.
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:02.847Z: Expanding GroupByKey operations into streaming Read/Write steps
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:02.892Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:02.981Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.008Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.035Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.067Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.102Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.124Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.146Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.181Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.206Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.228Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.260Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.296Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.357Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.387Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.419Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.442Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.474Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.507Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.529Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.576Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.611Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.633Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.665Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:03.836Z: Running job using Streaming Engine
May 03, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:04.175Z: Starting 5 workers in us-central1-b...
May 03, 2022 12:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:19.910Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
May 03, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:46:26.469Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
May 03, 2022 12:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T12:47:36.901Z: Workers have started successfully.
May 03, 2022 4:01:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T16:01:02.422Z: Cancel request is committed for workflow job: 2022-05-03_05_45_49-12202947964391131369.
May 03, 2022 4:01:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T16:01:02.525Z: Cleaning up.
May 03, 2022 4:01:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T16:01:02.685Z: Stopping worker pool...
May 03, 2022 4:01:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T16:01:02.742Z: Stopping worker pool...
May 03, 2022 4:01:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T16:01:38.352Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
May 03, 2022 4:01:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-03T16:01:38.407Z: Worker pool stopped.
May 03, 2022 4:01:46 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-05-03_05_45_49-12202947964391131369 finished with status CANCELLED.
Load test results for test (ID): 3fa53456-e193-4a40-9194-c44fbdaa3b9f and timestamp: 2022-05-03T12:45:39.680000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11576.192
dataflow_v2_java11_total_bytes_count             1.98093633E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220503124335
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c8fd29f18ae9cd4c35079dadd0546583a666dd3bc1827f0935ec365990e097cc
Deleted: sha256:77797acbc43399e9ab657faa73b4cba9807d6c72bb6e46f6edea30eb2fa0fb9f
Deleted: sha256:8a55d4f92c0d1cca22085afea7c272e14fcc65d65cf7406939a935b7a5445bc5
Deleted: sha256:43d41362ec0ac58fd8cab074623c8ece1a77c1ad175723c740c5338faf21902e
Deleted: sha256:f6a5a984d823a47f549631f558ba6473d3077e90ca0029caa0b2674e29f67cc9
Deleted: sha256:f87d3ea36cabcece3632ff4a62d7091a3703d64e35baa0a84d0713b705a2818b
Deleted: sha256:3d871e7fee474850a8251731aea8433c4def3080ffcd2501edb9d7fbe0975c92
Deleted: sha256:0aaad32c70dbaea14fce3a4f010e16593908040df1a844ba4b2e591186f2c4cc
Deleted: sha256:f8486df3428b6ae1406bcee567b89d7d3392d33d9988d035e6b4e31cf85b9f24
Deleted: sha256:b27db9e1945e72a938e52ec363f8ad7423e67c448c0aa44e4c5b337514747f12
Deleted: sha256:5176ee6212ed61d21b334e84dfcf43deea67babcf1de8d524a6eb9ab14e4e61e
Deleted: sha256:d32951a414779750125b535f6f08d6c3817e7b0d8989b76cb06e1a2e95ebe8e8
Deleted: sha256:ffd241957a7ce08f933a7e75b75fa72d2871e21741655cec76971e05a0d860dc
Deleted: sha256:73b82c63a5d99feb4612ca2ee42d43a1fa7beb1eccbff55baeed5d91da116907
Deleted: sha256:1cca4232cf482ab799bbea72eb1b51d44fddef57dc842a8aaf1eae216d8ae5d6
Deleted: sha256:605c3b170015ecbb3c6e1525a7eb5fdb7d27acde7fde0ddc56abae17c2d3a381
Deleted: sha256:97c7c9f7a8d43eb8b113638f65eb10088181a8acfd35ddd2f75f97072903051e
Deleted: sha256:b23d222479d8e24cfa92c87f5a5cb2b6a6ca233fbe656481851d82ac00d3986a
Deleted: sha256:ead84806d518e2168b678c1a022b902814c83ceeb9c3e5c724f7fbc3392bc04a
Deleted: sha256:df04bb64453acf8be71a291c6c632208b2e76244ac906e66e8bc5b1916d4108d
Deleted: sha256:55434fb3f2a9bcaaefb3192b477dd1091d9afd5451ccd25e5f4bfef87199644f
Deleted: sha256:2281a2de74dba2c958db859fb9ae334fe151280279908fe0fd6ff638712d9acb
Deleted: sha256:0c3696bd0f69116745d3c8429911fea097659fe73cc99306aed57d51673e993c
Deleted: sha256:befdca89fddbb2471554d829fb8c4bf5948fdfa4e97b92fa835e215acb818b87
Deleted: sha256:2a981d8c87fb6091f2487e4d0966937aba5411c083222b0a4fbf1ff2af2a4774
Deleted: sha256:35cbd9885a01f93cd487130ca3d47f620e32d4fcdb6fd99b35d7a773217f0032
Deleted: sha256:b48baa8cf4619fb4a4e9d456e8f2abfe50ffc386d04ef63ab972a52969aa2a1b
Deleted: sha256:995ad8ccb14a171a3dbf1cb759f8b3533aabd55c58c82b47f861ed51a92a5da9
Deleted: sha256:ec39766a8bf313e75267f2410fac32775899dfe9e2f7056bcf6c73212feffb79
Deleted: sha256:050565a41472653c33b0f77947085b0eca61b026b9afb55d3a0e3710616cac33
Deleted: sha256:dbcb31eb4337d3773fa9ee4cf4bf715cc0bb70ba6f4fcc099294c9589420dafe
Deleted: sha256:dad918dce7b375c4ba5a7291cf2291565d1f1f871e2921ae19ba953204e3bf69
Deleted: sha256:b00d4e7ce815aac593f4d0e3c3e8f3c3ffca5f59bb1e864887c1342f553a3bea
Deleted: sha256:468f275b66fbe4d1ce03751d587303813b73973d4e729f7661580162b4666b63
Deleted: sha256:2b710a722c03b3b01561f12037ed457a612689e160d9953dee9756338c147d5e
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220503124335]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c8fd29f18ae9cd4c35079dadd0546583a666dd3bc1827f0935ec365990e097cc]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220503124335] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c8fd29f18ae9cd4c35079dadd0546583a666dd3bc1827f0935ec365990e097cc])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c8fd29f18ae9cd4c35079dadd0546583a666dd3bc1827f0935ec365990e097cc
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c8fd29f18ae9cd4c35079dadd0546583a666dd3bc1827f0935ec365990e097cc
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c8fd29f18ae9cd4c35079dadd0546583a666dd3bc1827f0935ec365990e097cc].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 45s
109 actionable tasks: 75 executed, 30 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/5w2pir4tqm5ji

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #317

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/317/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11105] Add manual watermark estimation (#17475)


------------------------------------------
[...truncated 388.27 KB...]
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.8/urllib/request.py", line 531, in open
    response = meth(req, response)
  File "/usr/lib/python3.8/urllib/request.py", line 640, in http_response
    response = self.parent.error(
  File "/usr/lib/python3.8/urllib/request.py", line 569, in error
    return self._call_chain(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 649, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 403: Forbidden
ERROR:root:Invalid url for slf4j-simple-1.7.30: http://www.opensource.org/licenses/mit-license.php after 9 retries.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py>", line 58, in pull_from_url
    url_read = urlopen(url)
  File "/usr/lib/python3.8/urllib/request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.8/urllib/request.py", line 531, in open
    response = meth(req, response)
  File "/usr/lib/python3.8/urllib/request.py", line 640, in http_response
    response = self.parent.error(
  File "/usr/lib/python3.8/urllib/request.py", line 569, in error
    return self._call_chain(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 649, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 403: Forbidden
ERROR:root:Invalid url for jdbc-1.16.3: http://opensource.org/licenses/MIT after 9 retries.
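These 403s are the instructive part of the log: the server is rejecting the request itself, so nine retries of the identical request cannot succeed. A common cause of a 403 from license hosts such as opensource.org is the default `Python-urllib` User-Agent being blocked. A minimal sketch of a fetch helper that sends an explicit User-Agent and retries only server-side errors (hypothetical code, not part of pull_licenses_java.py):

```python
import urllib.request
from urllib.error import HTTPError

def is_retryable(status):
    # 4xx responses such as 403 are deterministic: repeating the identical
    # request cannot help. Only server-side (5xx) failures are worth retrying.
    return status >= 500

def fetch_license(url, retries=3):
    # Send an explicit User-Agent; some hosts return 403 to the default
    # "Python-urllib/3.x" agent.
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    for attempt in range(retries):
        try:
            with urllib.request.urlopen(req, timeout=30) as resp:
                return resp.read().decode("utf-8", errors="replace")
        except HTTPError as e:
            if not is_retryable(e.code) or attempt == retries - 1:
                raise
```

With this split, a 403 fails fast on the first attempt instead of burning nine retries per dependency, which is roughly what the 162-second runtime below is spent on.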
[... identical HTTP 403 traceback ...]
ERROR:root:Invalid url for postgresql-1.16.3: http://opensource.org/licenses/MIT after 9 retries.
[... identical HTTP 403 traceback and "Retrying..." error for kafka-1.16.3 repeated for 9 attempts ...]
ERROR:root:Invalid url for kafka-1.16.3: http://opensource.org/licenses/MIT after 9 retries.
ERROR:root:['google-auth-library-oauth2-http-1.4.0', 'reflectasm-1.07', 'system-rules-1.19.0', 'zstd-jni-1.4.3-1', 'kryo-2.21', 'software-and-algorithms-1.0', 'junit-quickcheck-generators-0.8', 'google-auth-library-credentials-1.4.0', 'protobuf-java-util-3.19.3', 'classgraph-4.8.104', 'protobuf-java-3.19.3', 'grpc-context-1.44.0', 'grpc-protobuf-1.44.0', 'grpc-alts-1.44.0', 'perfmark-api-0.23.0', 'junit-dep-4.11', 'minlog-1.2', 'zstd-jni-1.5.2-1', 'junit-quickcheck-core-0.8', 'checker-compat-qual-2.5.3', 'grpc-core-1.44.0', 'grpc-api-1.44.0', 'grpc-protobuf-lite-1.44.0', 'duct-tape-1.0.8', 'pcollections-2.1.2', 'hamcrest-2.1', 'slf4j-jdk14-1.7.30', 'mysql-1.16.3', 'database-commons-1.16.3', 'checker-compat-qual-2.5.5', 'grpc-grpclb-1.44.0', 'slf4j-api-1.7.30', 'grpc-auth-1.44.0', 'grpc-stub-1.44.0', 'slf4j-simple-1.7.30', 'jdbc-1.16.3', 'postgresql-1.16.3', 'kafka-1.16.3']
ERROR:root:**************************************** Licenses were not able to be pulled automatically for some dependencies. Please search source code of the dependencies on the internet and add "license" and "notice" (if available) field to <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml> for each missing license. Dependency List: [checker-compat-qual-2.5.3,checker-compat-qual-2.5.5,classgraph-4.8.104,database-commons-1.16.3,duct-tape-1.0.8,google-auth-library-credentials-1.4.0,google-auth-library-oauth2-http-1.4.0,grpc-alts-1.44.0,grpc-api-1.44.0,grpc-auth-1.44.0,grpc-context-1.44.0,grpc-core-1.44.0,grpc-grpclb-1.44.0,grpc-protobuf-1.44.0,grpc-protobuf-lite-1.44.0,grpc-stub-1.44.0,hamcrest-2.1,jdbc-1.16.3,junit-dep-4.11,junit-quickcheck-core-0.8,junit-quickcheck-generators-0.8,kafka-1.16.3,kryo-2.21,minlog-1.2,mysql-1.16.3,pcollections-2.1.2,perfmark-api-0.23.0,postgresql-1.16.3,protobuf-java-3.19.3,protobuf-java-util-3.19.3,reflectasm-1.07,slf4j-api-1.7.30,slf4j-jdk14-1.7.30,slf4j-simple-1.7.30,software-and-algorithms-1.0,system-rules-1.19.0,zstd-jni-1.4.3-1,zstd-jni-1.5.2-1]
INFO:root:pull_licenses_java.py failed. It took 162.515366 seconds with 16 threads.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py>", line 321, in <module>
    raise RuntimeError('{n} error(s) occurred.'.format(n=len(error_msg)),
RuntimeError: ('1 error(s) occurred.', ['**************************************** Licenses were not able to be pulled automatically for some dependencies. Please search source code of the dependencies on the internet and add "license" and "notice" (if available) field to <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml> for each missing license. Dependency List: [checker-compat-qual-2.5.3,checker-compat-qual-2.5.5,classgraph-4.8.104,database-commons-1.16.3,duct-tape-1.0.8,google-auth-library-credentials-1.4.0,google-auth-library-oauth2-http-1.4.0,grpc-alts-1.44.0,grpc-api-1.44.0,grpc-auth-1.44.0,grpc-context-1.44.0,grpc-core-1.44.0,grpc-grpclb-1.44.0,grpc-protobuf-1.44.0,grpc-protobuf-lite-1.44.0,grpc-stub-1.44.0,hamcrest-2.1,jdbc-1.16.3,junit-dep-4.11,junit-quickcheck-core-0.8,junit-quickcheck-generators-0.8,kafka-1.16.3,kryo-2.21,minlog-1.2,mysql-1.16.3,pcollections-2.1.2,perfmark-api-0.23.0,postgresql-1.16.3,protobuf-java-3.19.3,protobuf-java-util-3.19.3,reflectasm-1.07,slf4j-api-1.7.30,slf4j-jdk14-1.7.30,slf4j-simple-1.7.30,software-and-algorithms-1.0,system-rules-1.19.0,zstd-jni-1.4.3-1,zstd-jni-1.5.2-1]'])
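The fix the error message asks for is an entry in dep_urls_java.yaml that pins a fetchable license URL for each dependency the script could not resolve. The exact schema is owned by the Beam license scripts; the sketch below is illustrative only, inferred from the error message's "license"/"notice" wording, and the URLs are placeholders to be verified before use:

```yaml
# Illustrative only: nesting and field names are an assumption, not a
# verified schema; replace the URLs with real, stable license locations.
jdbc:
  '1.16.3':
    license: "https://example.com/testcontainers/LICENSE"
kafka:
  '1.16.3':
    license: "https://example.com/testcontainers/LICENSE"
```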

> Task :sdks:java:container:pullLicenses FAILED
> Task :sdks:java:container:goPrepare UP-TO-DATE

> Task :sdks:java:container:goBuild
/home/jenkins/go/bin/go1.16.12 build -o ./build/target/linux_amd64/boot boot.go

> Task :sdks:java:container:java11:copySdkHarnessLauncher
Execution optimizations have been disabled for task ':sdks:java:container:java11:copySdkHarnessLauncher' to ensure correctness due to the following reasons:
  - Gradle detected a problem with the following location: '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target>'. Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this output of task ':sdks:java:container:downloadCloudProfilerAgent' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed. Please refer to https://docs.gradle.org/7.4/userguide/validation_problems.html#implicit_dependency for more details about this problem.
  - Gradle detected a problem with the following location: '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target>'. Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this output of task ':sdks:java:container:pullLicenses' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed. Please refer to https://docs.gradle.org/7.4/userguide/validation_problems.html#implicit_dependency for more details about this problem.
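Gradle names both the consumer and the producers here, so the warning also points at its own fix: declare the ordering explicitly. A sketch in Gradle's Groovy DSL, with the task paths copied from the warning (where exactly this belongs in the Beam build scripts is not shown in the log):

```groovy
// Sketch only: make the implicit producer/consumer relationship explicit so
// Gradle can schedule and cache these tasks correctly.
tasks.named('copySdkHarnessLauncher').configure {
    dependsOn ':sdks:java:container:downloadCloudProfilerAgent'
    dependsOn ':sdks:java:container:pullLicenses'
}
```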

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:container:pullLicenses'.
> Process 'command './license_scripts/license_script.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3m 17s
103 actionable tasks: 66 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/3cuhbfu6wwvng

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #316

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/316/display/redirect?page=changes>

Changes:

[noreply] Revert "Improvement to Seed job configuration to launch against PRs

[ilion.beyst] Minor: fix typo

[noreply] Merge pull request #17422 from [BEAM-14344]: remove tracing from


------------------------------------------
[...truncated 387.85 KB...]
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.8/urllib/request.py", line 531, in open
    response = meth(req, response)
  File "/usr/lib/python3.8/urllib/request.py", line 640, in http_response
    response = self.parent.error(
  File "/usr/lib/python3.8/urllib/request.py", line 569, in error
    return self._call_chain(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 649, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 403: Forbidden
ERROR:root:Invalid url for grpc-auth-1.44.0: https://opensource.org/licenses/Apache-2.0 after 9 retries.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py>", line 58, in pull_from_url
    url_read = urlopen(url)
  File "/usr/lib/python3.8/urllib/request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.8/urllib/request.py", line 531, in open
    response = meth(req, response)
  File "/usr/lib/python3.8/urllib/request.py", line 640, in http_response
    response = self.parent.error(
  File "/usr/lib/python3.8/urllib/request.py", line 569, in error
    return self._call_chain(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 649, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 403: Forbidden
ERROR:root:Invalid url for jdbc-1.16.3: http://opensource.org/licenses/MIT after 9 retries.
[... identical HTTP 403 traceback ...]
ERROR:root:Invalid url for postgresql-1.16.3: http://opensource.org/licenses/MIT after 9 retries.
[... identical HTTP 403 traceback and "Retrying..." error for kafka-1.16.3 repeated for 9 attempts ...]
ERROR:root:Invalid url for kafka-1.16.3: http://opensource.org/licenses/MIT after 9 retries.
ERROR:root:['google-auth-library-oauth2-http-1.4.0', 'system-rules-1.19.0', 'reflectasm-1.07', 'zstd-jni-1.4.3-1', 'software-and-algorithms-1.0', 'google-auth-library-credentials-1.4.0', 'junit-quickcheck-generators-0.8', 'kryo-2.21', 'protobuf-java-util-3.19.3', 'classgraph-4.8.104', 'protobuf-java-3.19.3', 'grpc-context-1.44.0', 'grpc-protobuf-1.44.0', 'grpc-alts-1.44.0', 'perfmark-api-0.23.0', 'junit-dep-4.11', 'zstd-jni-1.5.2-1', 'minlog-1.2', 'junit-quickcheck-core-0.8', 'checker-compat-qual-2.5.3', 'grpc-core-1.44.0', 'grpc-protobuf-lite-1.44.0', 'grpc-api-1.44.0', 'pcollections-2.1.2', 'duct-tape-1.0.8', 'hamcrest-2.1', 'slf4j-jdk14-1.7.30', 'mysql-1.16.3', 'database-commons-1.16.3', 'checker-compat-qual-2.5.5', 'grpc-grpclb-1.44.0', 'grpc-stub-1.44.0', 'slf4j-simple-1.7.30', 'slf4j-api-1.7.30', 'grpc-auth-1.44.0', 'jdbc-1.16.3', 'postgresql-1.16.3', 'kafka-1.16.3']
ERROR:root:**************************************** Licenses were not able to be pulled automatically for some dependencies. Please search source code of the dependencies on the internet and add "license" and "notice" (if available) field to <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml> for each missing license. Dependency List: [checker-compat-qual-2.5.3,checker-compat-qual-2.5.5,classgraph-4.8.104,database-commons-1.16.3,duct-tape-1.0.8,google-auth-library-credentials-1.4.0,google-auth-library-oauth2-http-1.4.0,grpc-alts-1.44.0,grpc-api-1.44.0,grpc-auth-1.44.0,grpc-context-1.44.0,grpc-core-1.44.0,grpc-grpclb-1.44.0,grpc-protobuf-1.44.0,grpc-protobuf-lite-1.44.0,grpc-stub-1.44.0,hamcrest-2.1,jdbc-1.16.3,junit-dep-4.11,junit-quickcheck-core-0.8,junit-quickcheck-generators-0.8,kafka-1.16.3,kryo-2.21,minlog-1.2,mysql-1.16.3,pcollections-2.1.2,perfmark-api-0.23.0,postgresql-1.16.3,protobuf-java-3.19.3,protobuf-java-util-3.19.3,reflectasm-1.07,slf4j-api-1.7.30,slf4j-jdk14-1.7.30,slf4j-simple-1.7.30,software-and-algorithms-1.0,system-rules-1.19.0,zstd-jni-1.4.3-1,zstd-jni-1.5.2-1]
INFO:root:pull_licenses_java.py failed. It took 162.386725 seconds with 16 threads.
Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py", line 321, in <module>
    raise RuntimeError('{n} error(s) occurred.'.format(n=len(error_msg)),
RuntimeError: ('1 error(s) occurred.', ['**************************************** Licenses were not able to be pulled automatically for some dependencies. Please search source code of the dependencies on the internet and add "license" and "notice" (if available) field to <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml> for each missing license. Dependency List: [checker-compat-qual-2.5.3,checker-compat-qual-2.5.5,classgraph-4.8.104,database-commons-1.16.3,duct-tape-1.0.8,google-auth-library-credentials-1.4.0,google-auth-library-oauth2-http-1.4.0,grpc-alts-1.44.0,grpc-api-1.44.0,grpc-auth-1.44.0,grpc-context-1.44.0,grpc-core-1.44.0,grpc-grpclb-1.44.0,grpc-protobuf-1.44.0,grpc-protobuf-lite-1.44.0,grpc-stub-1.44.0,hamcrest-2.1,jdbc-1.16.3,junit-dep-4.11,junit-quickcheck-core-0.8,junit-quickcheck-generators-0.8,kafka-1.16.3,kryo-2.21,minlog-1.2,mysql-1.16.3,pcollections-2.1.2,perfmark-api-0.23.0,postgresql-1.16.3,protobuf-java-3.19.3,protobuf-java-util-3.19.3,reflectasm-1.07,slf4j-api-1.7.30,slf4j-jdk14-1.7.30,slf4j-simple-1.7.30,software-and-algorithms-1.0,system-rules-1.19.0,zstd-jni-1.4.3-1,zstd-jni-1.5.2-1]'])
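The remedy the error message asks for is one entry per failing dependency in `sdks/java/container/license_scripts/dep_urls_java.yaml`. A hedged sketch of what such an entry might look like — the exact schema should be checked against existing entries in that file, and the URLs below are illustrative placeholders, not verified license locations:

```yaml
# Hypothetical entries; verify the schema against dep_urls_java.yaml and
# replace the URLs with the dependency's actual license/notice locations.
kafka:
  '1.16.3':
    license: "https://example.com/testcontainers/LICENSE"
hamcrest:
  '2.1':
    license: "https://example.com/JavaHamcrest/LICENSE.txt"
```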

> Task :sdks:java:container:pullLicenses FAILED
> Task :sdks:java:container:goPrepare UP-TO-DATE

> Task :sdks:java:container:goBuild
/home/jenkins/go/bin/go1.16.12 build -o ./build/target/linux_amd64/boot boot.go

> Task :sdks:java:container:java11:copySdkHarnessLauncher
Execution optimizations have been disabled for task ':sdks:java:container:java11:copySdkHarnessLauncher' to ensure correctness due to the following reasons:
  - Gradle detected a problem with the following location: '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target>'. Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this output of task ':sdks:java:container:downloadCloudProfilerAgent' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed. Please refer to https://docs.gradle.org/7.4/userguide/validation_problems.html#implicit_dependency for more details about this problem.
  - Gradle detected a problem with the following location: '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target>'. Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this output of task ':sdks:java:container:pullLicenses' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed. Please refer to https://docs.gradle.org/7.4/userguide/validation_problems.html#implicit_dependency for more details about this problem.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:container:pullLicenses'.
> Process 'command './license_scripts/license_script.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3m 17s
103 actionable tasks: 67 executed, 32 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ed5gfarnacgwo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #315

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/315/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Add element weighting parameter to BatchElements.

[Robert Bradshaw] Clearer test.

[noreply] Revert "Merge pull request #17260 from [BEAM-13229] [Website] bug side

[noreply] [BEAM-14001] Add missing test cases to existing suites in exec package

[noreply] [BEAM-14243] Add staticcheck to Github Actions Precommits (#17479)

[noreply] [BEAM-14368][BEAM-13984]Change model loading from constructor to

[noreply] [BEAM-13983] changed file name from sklearn_loader to sklearn_inference

[noreply] Add SQL in Notebooks blog post (#17481)

[noreply] Merge pull request #17404: [BEAM-13990] support date and timestamp


------------------------------------------
[...truncated 49.70 KB...]
0a41459588e0: Preparing
30e908a38a18: Preparing
cac2fff6ae3d: Preparing
08fa02ce37eb: Preparing
a037458de4e0: Preparing
bafdbe68e4ae: Preparing
a13c519c6361: Preparing
08fa02ce37eb: Waiting
07878e59dd1b: Waiting
bf6a5282bbb4: Waiting
a037458de4e0: Waiting
bb613443e375: Waiting
0a41459588e0: Waiting
bafdbe68e4ae: Waiting
1bae9d4ecbd0: Waiting
cac2fff6ae3d: Waiting
30e908a38a18: Waiting
a13c519c6361: Waiting
9bb318ede142: Waiting
aab977f249d6: Waiting
1644b5f577fe: Waiting
df8d0580d3cc: Waiting
7ca2d16964e0: Waiting
f737f5fea4b9: Pushed
f8532c307973: Pushed
019aadf738f8: Pushed
bcd8d309bc5b: Pushed
8087b227a860: Pushed
07878e59dd1b: Pushed
bb613443e375: Pushed
9bb318ede142: Pushed
7ca2d16964e0: Pushed
aab977f249d6: Pushed
1bae9d4ecbd0: Pushed
df8d0580d3cc: Pushed
0a41459588e0: Layer already exists
30e908a38a18: Layer already exists
cac2fff6ae3d: Layer already exists
08fa02ce37eb: Layer already exists
a037458de4e0: Layer already exists
bafdbe68e4ae: Layer already exists
a13c519c6361: Layer already exists
bf6a5282bbb4: Pushed
0df6135c1144: Pushed
1644b5f577fe: Pushed
20220430124337: digest: sha256:615747e4753b65b4840b1ff25870dbcffc794c4418f115dafe04965b96d1b7eb size: 4935

> Task :sdks:java:testing:load-tests:run
Apr 30, 2022 12:45:34 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Apr 30, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 210 files. Enable logging at DEBUG level to see which files will be staged.
Apr 30, 2022 12:45:35 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Apr 30, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Apr 30, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Apr 30, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 210 files cached, 0 files newly uploaded in 0 seconds
Apr 30, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Apr 30, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <116094 bytes, hash c5482fee9c8dc9c2ac2a08e39b414da1e588926dc6fe48effb204ad8b081859c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-xUgv7pyNycKsKgjjm0FNoeWIkm3G_kjv-yBK2LCBhZw.pb
Apr 30, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Apr 30, 2022 12:45:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d5af0a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5981f4a6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63dfada0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f231ced, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35a60674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63d4f0a2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d78f3d5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a4b5ce3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f5b6e78, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b4eced1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71926a36, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@216e9ca3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@75120e58, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48976e6d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a367e93, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f6874f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a6dc589, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@697a34af, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70211df5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c5228e7]
Apr 30, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Apr 30, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Apr 30, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Apr 30, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Apr 30, 2022 12:45:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b81616b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15d42ccb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@279dd959, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46383a78, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@36c281ed, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@244418a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b5a078a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c361f63, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ed922e1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4eb166a1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@554c4eaa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29fd8e67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@e146f93, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4bd5849e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7cdbaa50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39909d1a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1455154c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7343922c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@526b2f3e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f2e1024]
Apr 30, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Apr 30, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Apr 30, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Apr 30, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Apr 30, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Apr 30, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Apr 30, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Apr 30, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Apr 30, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Apr 30, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Apr 30, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Apr 30, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.39.0-SNAPSHOT
Apr 30, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-30_05_45_41-13589612111199287465?project=apache-beam-testing
Apr 30, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-04-30_05_45_41-13589612111199287465
Apr 30, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-04-30_05_45_41-13589612111199287465
Apr 30, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-04-30T12:45:47.320Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-04-986k. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Apr 30, 2022 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:52.252Z: Worker configuration: e2-standard-2 in us-central1-f.
Apr 30, 2022 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:53.090Z: Expanding SplittableParDo operations into optimizable parts.
Apr 30, 2022 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:53.161Z: Expanding CollectionToSingleton operations into optimizable parts.
Apr 30, 2022 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:53.311Z: Expanding CoGroupByKey operations into optimizable parts.
Apr 30, 2022 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:53.411Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Apr 30, 2022 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:53.441Z: Expanding GroupByKey operations into streaming Read/Write steps
Apr 30, 2022 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:53.509Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Apr 30, 2022 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:53.640Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Apr 30, 2022 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:53.677Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Apr 30, 2022 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:53.721Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:53.755Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:53.802Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:53.839Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:53.919Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:53.955Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:54.003Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:54.034Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:54.069Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:54.140Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:54.177Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:54.258Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:54.305Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:54.337Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:54.374Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:54.407Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:54.443Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:54.476Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:54.504Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:54.541Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:54.610Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:54.783Z: Running job using Streaming Engine
Apr 30, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:45:55.071Z: Starting 5 workers in us-central1-f...
Apr 30, 2022 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:46:17.841Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Apr 30, 2022 12:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:46:21.277Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Apr 30, 2022 12:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T12:47:24.951Z: Workers have started successfully.
Apr 30, 2022 4:01:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T16:01:09.181Z: Cancel request is committed for workflow job: 2022-04-30_05_45_41-13589612111199287465.
Apr 30, 2022 4:01:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T16:01:14.389Z: Cleaning up.
Apr 30, 2022 4:01:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T16:01:14.465Z: Stopping worker pool...
Apr 30, 2022 4:01:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T16:01:14.532Z: Stopping worker pool...
Apr 30, 2022 4:01:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T16:01:47.637Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 30, 2022 4:01:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-30T16:01:47.669Z: Worker pool stopped.
Apr 30, 2022 4:01:54 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-30_05_45_41-13589612111199287465 finished with status CANCELLED.
Load test results for test (ID): e92feb79-19c9-4431-83d6-e0e12b958e60 and timestamp: 2022-04-30T12:45:35.475000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11603.125
dataflow_v2_java11_total_bytes_count             3.24106316E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220430124337
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:615747e4753b65b4840b1ff25870dbcffc794c4418f115dafe04965b96d1b7eb
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220430124337]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:615747e4753b65b4840b1ff25870dbcffc794c4418f115dafe04965b96d1b7eb]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220430124337] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:615747e4753b65b4840b1ff25870dbcffc794c4418f115dafe04965b96d1b7eb])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:615747e4753b65b4840b1ff25870dbcffc794c4418f115dafe04965b96d1b7eb
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:615747e4753b65b4840b1ff25870dbcffc794c4418f115dafe04965b96d1b7eb
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:615747e4753b65b4840b1ff25870dbcffc794c4418f115dafe04965b96d1b7eb].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 38s
109 actionable tasks: 74 executed, 31 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/mzrj4sweclqkq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #314

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/314/display/redirect?page=changes>

Changes:

[ihr] Update Java katas to Beam 2.38

[noreply] [BEAM-14369] Fix "target/options: no such file or directory" error while

[noreply] [BEAM-14297] Enable nullable key and value arrays for xlang kafka io

[noreply] Merge pull request #17444 from [BEAM-14310] [Website] bug home

[noreply] Merge pull request #17388 from [BEAM-14311] [Website] Home Page

[noreply] [BEAM-14376] Typo in method description doc

[noreply] Add default classpath when not present (#17491)

[thiagotnunes] fix: update javadocs for ChangeStreamMetrics

[noreply] Merge pull request #17443 from [BEAM-12164]: use the end timestamp for

[noreply] Merge pull request #17260 from [BEAM-13229] [Website] bug side nav

[noreply] [BEAM-14351] Fix the template and move the announcement to the next


------------------------------------------
[...truncated 50.44 KB...]
30e908a38a18: Preparing
cac2fff6ae3d: Preparing
08fa02ce37eb: Preparing
a037458de4e0: Preparing
bafdbe68e4ae: Preparing
a13c519c6361: Preparing
d5f5cd17dcb9: Waiting
3ac9ccc6299e: Waiting
30e908a38a18: Waiting
cac2fff6ae3d: Waiting
a037458de4e0: Waiting
bafdbe68e4ae: Waiting
b52aa05a9b0b: Waiting
a13c519c6361: Waiting
c9d970da0a0c: Waiting
19d07ee8cf97: Waiting
422615b92e82: Waiting
0a41459588e0: Waiting
08fa02ce37eb: Waiting
a91a9717f492: Pushed
209a345c7e0d: Pushed
74bd9c5ccf24: Pushed
58ec1fe908fc: Pushed
e4cc75bc8d67: Pushed
8d0616eadbf7: Pushed
b52aa05a9b0b: Pushed
2c93213bbf4d: Pushed
332613043c6a: Pushed
19d07ee8cf97: Pushed
c9d970da0a0c: Pushed
422615b92e82: Pushed
0a41459588e0: Layer already exists
cac2fff6ae3d: Layer already exists
08fa02ce37eb: Layer already exists
30e908a38a18: Layer already exists
a037458de4e0: Layer already exists
bafdbe68e4ae: Layer already exists
a13c519c6361: Layer already exists
3ac9ccc6299e: Pushed
d5f5cd17dcb9: Pushed
5117e33db1bb: Pushed
20220429124336: digest: sha256:7727baed219d8f3cb45619757abc795b9418c2357f609af0cd31248521c97753 size: 4935

> Task :sdks:java:testing:load-tests:run
Apr 29, 2022 12:45:30 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Apr 29, 2022 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 208 files. Enable logging at DEBUG level to see which files will be staged.
Apr 29, 2022 12:45:31 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Apr 29, 2022 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Apr 29, 2022 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 208 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Apr 29, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 208 files cached, 0 files newly uploaded in 0 seconds
Apr 29, 2022 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Apr 29, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <115405 bytes, hash a91b82802dea157f9d29256145c10c965d6401d6530d3199deea64135f042a07> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-qRuCgC3qFX-dKSVhRcEMll1kAdZTDTGZ3upkE18EKgc.pb
Apr 29, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Apr 29, 2022 12:45:38 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59696551, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@648d0e6d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79e66b2f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@17273273, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f69e2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@984169e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43f1bb92, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d6bbd35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c5d6175, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7544ac86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b27b497, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b1534d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c74aa0d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c841199, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a818392, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@489091bd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@512d6e60, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1de9b505, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b122839, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3743539f]
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Apr 29, 2022 12:45:38 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c5ddccd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1dbd580, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c101cc1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d0d91a1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb48179, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@201c3cda, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c86da0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d97caa4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6732726, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@474821de, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3d64c581, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ec5ea63, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4190bc8a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47d023b7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c83ae01, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d64c100, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@69d45cca, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2fdf17dc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e6d4780, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@650ae78c]
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.39.0-SNAPSHOT
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-29_05_45_38-18208232720786149688?project=apache-beam-testing
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-04-29_05_45_38-18208232720786149688
Apr 29, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-04-29_05_45_38-18208232720786149688
Apr 29, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-04-29T12:45:44.889Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-04-7hp3. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Apr 29, 2022 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:45:54.766Z: Worker configuration: e2-standard-2 in us-central1-b.
Apr 29, 2022 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:45:59.821Z: Expanding SplittableParDo operations into optimizable parts.
Apr 29, 2022 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:45:59.854Z: Expanding CollectionToSingleton operations into optimizable parts.
Apr 29, 2022 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:45:59.920Z: Expanding CoGroupByKey operations into optimizable parts.
Apr 29, 2022 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:45:59.990Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Apr 29, 2022 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.048Z: Expanding GroupByKey operations into streaming Read/Write steps
Apr 29, 2022 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.122Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.243Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.285Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.319Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.344Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.381Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.411Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.444Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.477Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.562Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.599Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.634Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.658Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.666Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.701Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.733Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.770Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.798Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.823Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.856Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.880Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.913Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.947Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:00.971Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:01.006Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:01.189Z: Running job using Streaming Engine
Apr 29, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:01.463Z: Starting 5 workers in us-central1-b...
Apr 29, 2022 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:23.944Z: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
Apr 29, 2022 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:23.977Z: Resized worker pool to 4, though goal was 5. This could be a quota issue.
Apr 29, 2022 12:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:46:34.185Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Apr 29, 2022 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T12:47:34.096Z: Workers have started successfully.
Apr 29, 2022 4:01:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T16:01:03.782Z: Cancel request is committed for workflow job: 2022-04-29_05_45_38-18208232720786149688.
Apr 29, 2022 4:01:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T16:01:08.529Z: Cleaning up.
Apr 29, 2022 4:01:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T16:01:08.610Z: Stopping worker pool...
Apr 29, 2022 4:01:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T16:01:08.667Z: Stopping worker pool...
Apr 29, 2022 4:01:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T16:01:43.421Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 29, 2022 4:01:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-29T16:01:43.462Z: Worker pool stopped.
Apr 29, 2022 4:01:51 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-29_05_45_38-18208232720786149688 finished with status CANCELLED.
Load test results for test (ID): 22a96f17-b025-4565-b1da-a6f34ddbad50 and timestamp: 2022-04-29T12:45:31.311000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11593.064
dataflow_v2_java11_total_bytes_count             3.08986423E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220429124336
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7727baed219d8f3cb45619757abc795b9418c2357f609af0cd31248521c97753
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220429124336]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7727baed219d8f3cb45619757abc795b9418c2357f609af0cd31248521c97753]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220429124336] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7727baed219d8f3cb45619757abc795b9418c2357f609af0cd31248521c97753])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7727baed219d8f3cb45619757abc795b9418c2357f609af0cd31248521c97753
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7727baed219d8f3cb45619757abc795b9418c2357f609af0cd31248521c97753
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7727baed219d8f3cb45619757abc795b9418c2357f609af0cd31248521c97753].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 37s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/aya6fox5ipygs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #313

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/313/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11104] Add Checkpointing split to Go SDK (#17386)

[noreply] Merge pull request #17226 from [BEAM-14204] [Playground] Tests for

[noreply] [BEAM-13015, BEAM-14184] Address unbounded number of messages being

[noreply] Improvement to Seed job configuration to launch against PRs (#17468)

[noreply] [BEAM-13983] Small changes to sklearn runinference (#17459)

[chamikaramj] Renames ExternalPythonTransform to PythonExternalTransform

[noreply] [BEAM-14351] Inherit from Coder. (#17437)


------------------------------------------
[...truncated 51.60 KB...]
Apr 28, 2022 12:45:21 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Apr 28, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 208 files. Enable logging at DEBUG level to see which files will be staged.
Apr 28, 2022 12:45:22 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Apr 28, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Apr 28, 2022 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 208 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Apr 28, 2022 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 208 files cached, 0 files newly uploaded in 0 seconds
Apr 28, 2022 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Apr 28, 2022 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <115405 bytes, hash 55dcef821fc31813a99c641f594b979ba1f8ce1e078948e520b8d4252b112b1c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Vdzvgh_DGBOpnGQfWUuXm6H4zh4HiUjlILjUJSsRKxw.pb
Apr 28, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Apr 28, 2022 12:45:28 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57272109, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59696551, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@648d0e6d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79e66b2f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@17273273, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f69e2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@984169e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43f1bb92, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d6bbd35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c5d6175, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7544ac86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b27b497, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b1534d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c74aa0d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c841199, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a818392, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@489091bd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@512d6e60, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1de9b505, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b122839]
Apr 28, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Apr 28, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Apr 28, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Apr 28, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Apr 28, 2022 12:45:28 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a6f6c7e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c5ddccd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1dbd580, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c101cc1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d0d91a1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb48179, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@201c3cda, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c86da0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d97caa4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6732726, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@474821de, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3d64c581, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ec5ea63, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4190bc8a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47d023b7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c83ae01, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d64c100, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@69d45cca, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2fdf17dc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e6d4780]
Apr 28, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Apr 28, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Apr 28, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Apr 28, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Apr 28, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Apr 28, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Apr 28, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Apr 28, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Apr 28, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Apr 28, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Apr 28, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Apr 28, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.39.0-SNAPSHOT
Apr 28, 2022 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-28_05_45_28-14162986301265808434?project=apache-beam-testing
Apr 28, 2022 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-04-28_05_45_28-14162986301265808434
Apr 28, 2022 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-04-28_05_45_28-14162986301265808434
Apr 28, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-04-28T12:45:35.360Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-04-t1p0. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Apr 28, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:39.792Z: Worker configuration: e2-standard-2 in us-central1-f.
Apr 28, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:40.707Z: Expanding SplittableParDo operations into optimizable parts.
Apr 28, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:40.733Z: Expanding CollectionToSingleton operations into optimizable parts.
Apr 28, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:40.841Z: Expanding CoGroupByKey operations into optimizable parts.
Apr 28, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:40.914Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Apr 28, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:40.935Z: Expanding GroupByKey operations into streaming Read/Write steps
Apr 28, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:40.990Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Apr 28, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.092Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Apr 28, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.119Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.169Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.214Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.236Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.268Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.303Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.328Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.353Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.386Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.412Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.445Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.476Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.502Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.535Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.583Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.617Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.651Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.718Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.811Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.835Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.869Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:41.896Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:42.070Z: Running job using Streaming Engine
Apr 28, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:42.355Z: Starting 5 workers in us-central1-f...
Apr 28, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:45:46.374Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Apr 28, 2022 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:46:07.134Z: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
Apr 28, 2022 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:46:07.165Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
Apr 28, 2022 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:46:17.351Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Apr 28, 2022 12:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T12:47:17.635Z: Workers have started successfully.
Apr 28, 2022 1:18:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-04-28T13:18:47.120Z: Staged package gax-grpc-2.8.1-Oaqma5KQUeKekzhWvuQ9BCsF3Uf1NF_DCF0Um3fxw3s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/gax-grpc-2.8.1-Oaqma5KQUeKekzhWvuQ9BCsF3Uf1NF_DCF0Um3fxw3s.jar' is inaccessible.
Apr 28, 2022 1:18:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-04-28T13:18:49.585Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Apr 28, 2022 1:21:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-04-28T13:21:45.880Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Apr 28, 2022 4:01:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T16:01:04.987Z: Cancel request is committed for workflow job: 2022-04-28_05_45_28-14162986301265808434.
Apr 28, 2022 4:01:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T16:01:09.494Z: Cleaning up.
Apr 28, 2022 4:01:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T16:01:09.590Z: Stopping worker pool...
Apr 28, 2022 4:01:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T16:01:09.634Z: Stopping worker pool...
Apr 28, 2022 4:01:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T16:01:42.592Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 28, 2022 4:01:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-28T16:01:42.638Z: Worker pool stopped.
Apr 28, 2022 4:01:56 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-28_05_45_28-14162986301265808434 finished with status CANCELLED.
Load test results for test (ID): d3604223-1f18-4dab-b1ed-56c91df76e87 and timestamp: 2022-04-28T12:45:22.647000000Z:
Metric:                                    Value:
dataflow_v2_java11_runtime_sec             11591.561
dataflow_v2_java11_total_bytes_count       3.63972961E10
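The two reported metrics imply an average end-to-end throughput. A quick sanity check (the class name is ours; the constants are copied from the metrics above):

```java
// Derives average throughput from the two load-test metrics reported above.
// ThroughputCheck is a hypothetical helper, not part of the Beam load-test suite.
public class ThroughputCheck {
    public static void main(String[] args) {
        double runtimeSec = 11591.561;       // dataflow_v2_java11_runtime_sec
        double totalBytes = 3.63972961e10;   // dataflow_v2_java11_total_bytes_count
        double bytesPerSec = totalBytes / runtimeSec;
        // Roughly 3.14 MB/s averaged over the ~3.2 h streaming run.
        System.out.printf("avg throughput: %.2f MB/s%n", bytesPerSec / 1e6);
    }
}
```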
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
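The stack trace shows the harness failing because the job's terminal state was CANCELLED rather than DONE. A minimal sketch of that kind of check (the class and method here are hypothetical stand-ins, not Beam's actual JobFailure implementation):

```java
import java.util.Set;

// Sketch of treating any terminal Dataflow job state other than DONE as a
// load-test failure, mirroring the behavior seen in the stack trace above.
public class TerminalStateCheck {
    // Terminal states a Dataflow job can end in.
    private static final Set<String> TERMINAL =
        Set.of("DONE", "FAILED", "CANCELLED", "UPDATED", "DRAINED");

    // Only DONE counts as success; CANCELLED (as in this build) is a failure.
    static boolean isFailure(String state) {
        return TERMINAL.contains(state) && !state.equals("DONE");
    }

    public static void main(String[] args) {
        System.out.println(isFailure("CANCELLED"));
    }
}
```

Under this rule the 16:01 cancel request (issued after the load-test window elapsed) necessarily surfaces as a non-zero exit from the Gradle `run` task.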

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220428124328
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1fc998e42139727d2c3786f1de49786ca539cf13c81d83afa97ab6cf29387608
Deleted: sha256:e63dae9ac65c339c0916f0549055af3053164dce65c80929e20deca37dce48de
Deleted: sha256:4cb5f3e8c7fb7eb319a6d8e234ea829cd1902a9bec42f7c87733d18a27015e0c
Deleted: sha256:ddc8fa9be5a899b578a0797a0b609d32cccc60ed23bd24e51c35e1c8d6dc83ea
Deleted: sha256:d6ec4448d38cd65fdd8c10fb3ea872f28f3140f4381b9da4ba68862ae87a8033
Deleted: sha256:f808b9a8450702f142b809bfa712a453b115a57858908bcbd442fc10f5b148e9
Deleted: sha256:7ed3a05cf07daa4a77d5c5760866946d1728ab923d8f1ce993e3513ba1bc7afb
Deleted: sha256:3b6adc3b0d05fe12b9871660a7e5c709c15e4d5ff269bca3b8b2c2f85cd400cb
Deleted: sha256:c25a0f797594683f876e1748185795a6c39d2de017debfdafbd5e9cc817d0e9c
Deleted: sha256:f98b15603994e1d7b996292e3a78e886947d6989bde0ecb575437d2b62e30b91
Deleted: sha256:18aa10d17e1059af2189417e6bf8a6395882f3cc1592523282f8dce3c4fc0f20
Deleted: sha256:1be5aa0a3662b9abd63d597a3fe28064c3f327c8042320261a426a1980a10f8a
Deleted: sha256:809ec7b494cfb7e5e9952968cf72206be6856c0f99bf1bde7ce6686db8969e97
Deleted: sha256:3b9a3a94cb559fd2420be83bf07641aa8730a61caebed4811711c8d85615a4ea
Deleted: sha256:79c62ad84884ca3e1df636c7fd583d2ebc5f753147ba7ccb2d5145d0b987b7e1
Deleted: sha256:74c78829015b5b1ec7554a1d3885a61d5dfb81635eb4016f20abc6d214a2f0d3
Deleted: sha256:400ff2b77f562ed3bb5e5548dcc512e4f846361fe7a5d64d2488d482a25db624
Deleted: sha256:0e5435a594dd275692b49d956fc68cfb2f1f1e7b7c37286f204934b7a2d2c8cc
Deleted: sha256:7fd9cadf12a5e9255759a17d7eb25bb95c34ca729197ad780989188f7de00868
Deleted: sha256:d8e7cfcc9613961167573bb57211e452d35d289b11eb8654d009fa674a1defb0
Deleted: sha256:aada67f0f22e1179493580f4d7287ddf3760eeef0bcea0d21f167109fb3c06fe
Deleted: sha256:e6ae339c9c3bb5e0b6d345e969ee02ffbfc5491ba1b58e59a70f579c13664612
Deleted: sha256:3634bb3fc73b37a6f03318e5dc731c274e4da2de1f4f08dd5dd35b2bc5b430f1
Deleted: sha256:b90e82bb89c75e64f0a35cf43a97474fb7e38bac02a164e0fcd403f228a4d912
Deleted: sha256:dbe8df52f07e94bb791fb425fbeffd1d2be286714c2a0b198cbe1127914b1581
Deleted: sha256:ac6019c90a238573f59566ee6535e845c8c79952bbcfb702b12b49c2edf17a95
Deleted: sha256:7321a757b06f23cf1e556b964054f443280190b2b859ba622967a5186c0fd9b5
Deleted: sha256:72e0b9000f9e0d943777af6362acd698e4e7c14a5a6307f321d354257f62826c
Deleted: sha256:3829a78eb17ddfb13cf9f0e2111af03c80d3e2f62aa4d0e91005542f966e90fb
Deleted: sha256:eb60c9941049c0e26fc14bc16fac4e0bde0ca9fc6cdee76beb0c71c3a41e4b8e
Deleted: sha256:e02157c2fa0df1112fe6f2432dd855c517bc7a28287919080dd366b97af7330b
Deleted: sha256:313c4b59f251e68da037876d6fb956a4c1053b506f37c468b3557b7c473771ad
Deleted: sha256:de98d51faa74201b78d3853be89ca0205e37ef0100b0f41d83d1b14540912d94
Deleted: sha256:3f7a84bd2f65815930caa0a72c24ac267c047e0e2c645c7d34e3920d9a54d543
Deleted: sha256:61bebe227603a7f304e0c5c2ee710ae0203e72d1026a53a0640ff46caf22230f
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220428124328]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1fc998e42139727d2c3786f1de49786ca539cf13c81d83afa97ab6cf29387608]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220428124328] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1fc998e42139727d2c3786f1de49786ca539cf13c81d83afa97ab6cf29387608])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1fc998e42139727d2c3786f1de49786ca539cf13c81d83afa97ab6cf29387608
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1fc998e42139727d2c3786f1de49786ca539cf13c81d83afa97ab6cf29387608
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1fc998e42139727d2c3786f1de49786ca539cf13c81d83afa97ab6cf29387608].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e150bab5afd54e7eb0a9e2e9d2379ba06eab09426d968d7a5ad2e2b44cfcd34c
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e150bab5afd54e7eb0a9e2e9d2379ba06eab09426d968d7a5ad2e2b44cfcd34c
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e150bab5afd54e7eb0a9e2e9d2379ba06eab09426d968d7a5ad2e2b44cfcd34c].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 41s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/gyygpqi55ckz2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #312

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/312/display/redirect?page=changes>

Changes:

[msbukal] FhirIO: use .search() or .searchType instead of .setResourceType()

[nick.caballero] [BEAM-14363] Fixes WatermarkParameters builder for Kinesis

[noreply] Remove unnecessary decorator from RunInference interface (#17463)

[noreply] [BEAM-13590] Minor deprecated warning fix (#17453)

[noreply] [BEAM-12164]: fix the negative throughput issue (#17461)

[noreply] Updated goldens for the screen diff integration tests (#17467)

[noreply] fixes copy by value error for bytes.Buffer in Error (#17469)

[noreply] Merge pull request #17354 from [BEAM-14170] - Create a test that runs

[noreply] Merge pull request #17447 from [BEAM-14357] Fix

[noreply] [BEAM-14324, BEAM-14325] Staticcheck cleanup in test files (#17393)

[noreply] BEAM-14187 Fix NPE (#17454)

[noreply] [BEAM-11105] Stateful watermark estimation (#17374)

[noreply] [BEAM-14304] implement parquetio to read/write parquet files (#17347)


------------------------------------------
[...truncated 50.80 KB...]
4226c6d08a2b: Preparing
0a41459588e0: Preparing
30e908a38a18: Preparing
cac2fff6ae3d: Preparing
08fa02ce37eb: Preparing
a037458de4e0: Preparing
bafdbe68e4ae: Preparing
eb8c38a950c2: Waiting
a13c519c6361: Preparing
e87a173ce27c: Waiting
79a90c6af3e1: Waiting
4226c6d08a2b: Waiting
240fa7a476c7: Waiting
c25122d795a7: Waiting
1a3751a137aa: Waiting
0a41459588e0: Waiting
8dd0f072504a: Waiting
311870697524: Waiting
18d6caee1d01: Waiting
cac2fff6ae3d: Waiting
a13c519c6361: Waiting
a037458de4e0: Waiting
bafdbe68e4ae: Waiting
fbfca3e71742: Pushed
3be8997d45f1: Pushed
262d0f42e7a3: Pushed
bdc0bb3c52a3: Pushed
9c66c5e8b97d: Pushed
eb8c38a950c2: Pushed
240fa7a476c7: Pushed
e87a173ce27c: Pushed
18d6caee1d01: Pushed
1a3751a137aa: Pushed
311870697524: Pushed
0a41459588e0: Layer already exists
30e908a38a18: Layer already exists
cac2fff6ae3d: Layer already exists
79a90c6af3e1: Pushed
08fa02ce37eb: Layer already exists
a037458de4e0: Layer already exists
bafdbe68e4ae: Layer already exists
a13c519c6361: Layer already exists
c25122d795a7: Pushed
4226c6d08a2b: Pushed
8dd0f072504a: Pushed
20220427124326: digest: sha256:0d0fc300f7dadefdb57dabefa679d1fd305588a59ca4b0cea40a3145515e2ae9 size: 4935

> Task :sdks:java:testing:load-tests:run
Apr 27, 2022 12:45:14 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Apr 27, 2022 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 208 files. Enable logging at DEBUG level to see which files will be staged.
Apr 27, 2022 12:45:15 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Apr 27, 2022 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Apr 27, 2022 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 208 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Apr 27, 2022 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 208 files cached, 0 files newly uploaded in 0 seconds
Apr 27, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Apr 27, 2022 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <115405 bytes, hash b7f84c916467d72355af79498d272bd6def3be163ada9355e68321354c49cddc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-t_hMkWRn1yNVr3lJjScr1t7zvhY62pNV5oMhNUxJzdw.pb
Apr 27, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Apr 27, 2022 12:45:22 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57272109, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59696551, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@648d0e6d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79e66b2f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@17273273, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f69e2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@984169e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43f1bb92, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d6bbd35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c5d6175, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7544ac86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b27b497, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b1534d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c74aa0d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c841199, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a818392, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@489091bd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@512d6e60, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1de9b505, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b122839]
Apr 27, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Apr 27, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Apr 27, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Apr 27, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Apr 27, 2022 12:45:22 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a6f6c7e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c5ddccd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1dbd580, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c101cc1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d0d91a1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb48179, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@201c3cda, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c86da0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d97caa4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6732726, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@474821de, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3d64c581, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ec5ea63, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4190bc8a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47d023b7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c83ae01, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d64c100, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@69d45cca, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2fdf17dc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e6d4780]
Apr 27, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Apr 27, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Apr 27, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Apr 27, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Apr 27, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Apr 27, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Apr 27, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Apr 27, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Apr 27, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Apr 27, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Apr 27, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Apr 27, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.39.0-SNAPSHOT
Apr 27, 2022 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-27_05_45_22-8540056087403892708?project=apache-beam-testing
Apr 27, 2022 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-04-27_05_45_22-8540056087403892708
Apr 27, 2022 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-04-27_05_45_22-8540056087403892708
Apr 27, 2022 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-04-27T12:45:29.163Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-04-n383. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Apr 27, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:35.483Z: Worker configuration: e2-standard-2 in us-central1-f.
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:36.417Z: Expanding SplittableParDo operations into optimizable parts.
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:36.443Z: Expanding CollectionToSingleton operations into optimizable parts.
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:36.512Z: Expanding CoGroupByKey operations into optimizable parts.
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:36.580Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:36.612Z: Expanding GroupByKey operations into streaming Read/Write steps
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:36.668Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:36.800Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:36.838Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:36.863Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:36.897Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:36.932Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:36.967Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.001Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.034Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.078Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.113Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.145Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.173Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.195Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.222Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.257Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.293Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.325Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.385Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.412Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.443Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.468Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.512Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Apr 27, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.536Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Apr 27, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.724Z: Running job using Streaming Engine
Apr 27, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:37.952Z: Starting 5 workers in us-central1-f...
Apr 27, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:45:56.990Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Apr 27, 2022 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:46:13.114Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Apr 27, 2022 12:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T12:47:18.078Z: Workers have started successfully.
Apr 27, 2022 4:00:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T16:00:57.500Z: Cancel request is committed for workflow job: 2022-04-27_05_45_22-8540056087403892708.
Apr 27, 2022 4:00:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T16:00:58.148Z: Cleaning up.
Apr 27, 2022 4:00:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T16:00:58.198Z: Stopping worker pool...
Apr 27, 2022 4:00:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T16:00:58.245Z: Stopping worker pool...
Apr 27, 2022 4:01:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T16:01:29.837Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 27, 2022 4:01:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-27T16:01:29.878Z: Worker pool stopped.
Apr 27, 2022 4:01:35 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-27_05_45_22-8540056087403892708 finished with status CANCELLED.
Load test results for test (ID): 563007d3-2174-4920-9b19-ff90f7af5eb8 and timestamp: 2022-04-27T12:45:15.149000000Z:
                              Metric:           Value:
        dataflow_v2_java11_runtime_sec        11594.064
  dataflow_v2_java11_total_bytes_count    2.29532811E10
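As a quick sanity check on the metrics above (not part of the Beam load-test harness; the constants below are simply copied from this log), the reported byte count and runtime imply a sustained throughput of roughly 2 MB/s:

```python
# Back-of-the-envelope throughput from the load-test metrics reported above.
# The constants are copied from this log; the script is illustrative only.

runtime_sec = 11594.064        # dataflow_v2_java11_runtime_sec
total_bytes = 2.29532811e10    # dataflow_v2_java11_total_bytes_count

throughput_mb_s = total_bytes / runtime_sec / 1e6   # bytes/s -> MB/s
hours = runtime_sec / 3600.0

print(f"~{throughput_mb_s:.2f} MB/s sustained over {hours:.2f} h")
```

The ~3.2 h runtime is consistent with the job's wall-clock span (launched 12:45 UTC, cancel committed 16:00 UTC), which suggests the CANCELLED terminal state is the test's scheduled cutoff rather than a worker failure.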
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220427124326
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0d0fc300f7dadefdb57dabefa679d1fd305588a59ca4b0cea40a3145515e2ae9
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220427124326]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0d0fc300f7dadefdb57dabefa679d1fd305588a59ca4b0cea40a3145515e2ae9]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220427124326] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0d0fc300f7dadefdb57dabefa679d1fd305588a59ca4b0cea40a3145515e2ae9])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0d0fc300f7dadefdb57dabefa679d1fd305588a59ca4b0cea40a3145515e2ae9
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0d0fc300f7dadefdb57dabefa679d1fd305588a59ca4b0cea40a3145515e2ae9
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0d0fc300f7dadefdb57dabefa679d1fd305588a59ca4b0cea40a3145515e2ae9].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 19s
109 actionable tasks: 74 executed, 31 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/wkki453a4bsfc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #311

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/311/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-14343] Allow expansion service override in ExternalPythonTransform

[Heejong Lee] update

[Heejong Lee] allows remote host

[Heejong Lee] improve compatibility with python rowcoder

[ahmedabualsaud] added tempLocation to test pipeline options

[ahmedabualsaud] using tempRoot for temp bucket location

[ahmedabualsaud] small fixes

[noreply] [BEAM-14320] Update programming-guide w/Java GroupByKey example (#17369)

[noreply] Minor: Fix release script for `current` symlinks (#17457)

[noreply] Minor: fix typo (#17452)

[noreply] Change return type for PytorchInferenceRunner (#17460)

[noreply] [BEAM-13608] JmsIO dynamic topics feature (#17163)

[Heejong Lee] add test


------------------------------------------
[...truncated 556.58 KB...]
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Apr 26, 2022 4:01:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-26T16:01:07.291Z: Cancel request is committed for workflow job: 2022-04-26_05_45_29-4722706681521895786.
Apr 26, 2022 4:01:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-26T16:01:12.267Z: Cleaning up.
Apr 26, 2022 4:01:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-26T16:01:12.361Z: Stopping worker pool...
Apr 26, 2022 4:01:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-26T16:01:12.425Z: Stopping worker pool...
Apr 26, 2022 4:01:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-26T16:01:45.069Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 26, 2022 4:01:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-26T16:01:45.119Z: Worker pool stopped.
Apr 26, 2022 4:01:54 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-26_05_45_29-4722706681521895786 finished with status CANCELLED.
Load test results for test (ID): ef92b33c-2574-4408-a493-6db438299e6d and timestamp: 2022-04-26T12:45:23.427000000Z:
                              Metric:           Value:
        dataflow_v2_java11_runtime_sec        11597.075
  dataflow_v2_java11_total_bytes_count    3.17860513E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220426124327
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:298f2ba3e6a45e04cc04e566b78633b5b5f64034d879cbd930fc139971ead11b
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220426124327]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:298f2ba3e6a45e04cc04e566b78633b5b5f64034d879cbd930fc139971ead11b]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220426124327] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:298f2ba3e6a45e04cc04e566b78633b5b5f64034d879cbd930fc139971ead11b])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:298f2ba3e6a45e04cc04e566b78633b5b5f64034d879cbd930fc139971ead11b
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:298f2ba3e6a45e04cc04e566b78633b5b5f64034d879cbd930fc139971ead11b
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:298f2ba3e6a45e04cc04e566b78633b5b5f64034d879cbd930fc139971ead11b].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 38s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/jxk6a4ajnzjak

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #310

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/310/display/redirect>

Changes:


------------------------------------------
[...truncated 77.12 KB...]
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Apr 25, 2022 12:53:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-04-25T12:53:16.048Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Apr 25, 2022 12:53:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-04-25T12:53:16.349Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Apr 25, 2022 12:53:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-04-25T12:53:16.596Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Apr 25, 2022 4:01:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-25T16:01:02.253Z: Cancel request is committed for workflow job: 2022-04-25_05_45_32-2761575958654551891.
Apr 25, 2022 4:01:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-25T16:01:06.913Z: Cleaning up.
Apr 25, 2022 4:01:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-25T16:01:07.024Z: Stopping worker pool...
Apr 25, 2022 4:01:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-25T16:01:07.066Z: Stopping worker pool...
Apr 25, 2022 4:01:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-25T16:01:40.445Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 25, 2022 4:01:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-25T16:01:40.483Z: Worker pool stopped.
Apr 25, 2022 4:01:46 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-25_05_45_32-2761575958654551891 finished with status CANCELLED.
Load test results for test (ID): 8d12e3ff-9c87-4104-b6e6-24679e932002 and timestamp: 2022-04-25T12:45:25.102000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11584.661
dataflow_v2_java11_total_bytes_count             3.85764918E10
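The two metrics above imply an average throughput for the run; a quick back-of-envelope check (values copied from this log; the class and method names are illustrative, not part of the Beam load-test harness):

```java
// Back-of-envelope throughput from the load-test metrics above
// (values copied from this run's log; names are illustrative).
public class ThroughputEstimate {
    // Average bytes processed per second over the whole run.
    public static double bytesPerSec(double totalBytes, double runtimeSec) {
        return totalBytes / runtimeSec;
    }

    public static void main(String[] args) {
        double runtimeSec = 11584.661;       // dataflow_v2_java11_runtime_sec
        double totalBytes = 3.85764918e10;   // dataflow_v2_java11_total_bytes_count
        double rate = bytesPerSec(totalBytes, runtimeSec);
        System.out.printf("~%.1f MB/s average throughput%n", rate / 1e6);
    }
}
```

This works out to roughly 3.3 MB/s averaged over the ~3.2-hour run.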
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
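The RuntimeException above is the load-test harness rejecting any terminal state other than a successful one: the job was cancelled at the test deadline, and `JobFailure.handleFailure` turns that into a non-zero exit. A minimal sketch of that kind of check (enum and class names are illustrative, not Beam's actual `JobFailure` API):

```java
import java.util.EnumSet;
import java.util.Set;

// Minimal sketch of a terminal-state check like the one that threw
// "Invalid job state: CANCELLED." above; names are illustrative,
// not Beam's actual API.
public class TerminalStateCheck {
    public enum JobState { DONE, CANCELLED, FAILED }

    // Terminal states treated as load-test failure.
    private static final Set<JobState> FAILURE_STATES =
        EnumSet.of(JobState.CANCELLED, JobState.FAILED);

    public static void handleFailure(JobState terminalState) {
        if (FAILURE_STATES.contains(terminalState)) {
            // Mirrors the message seen in the stack trace above.
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        handleFailure(JobState.DONE);        // passes silently
        handleFailure(JobState.CANCELLED);   // throws, failing the build
    }
}
```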

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220425124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f8b6659171be832f5b702ddc5cec21638042c41d6a2f3cfc168933655b6a8eab
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220425124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f8b6659171be832f5b702ddc5cec21638042c41d6a2f3cfc168933655b6a8eab]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220425124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f8b6659171be832f5b702ddc5cec21638042c41d6a2f3cfc168933655b6a8eab])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f8b6659171be832f5b702ddc5cec21638042c41d6a2f3cfc168933655b6a8eab
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f8b6659171be832f5b702ddc5cec21638042c41d6a2f3cfc168933655b6a8eab
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f8b6659171be832f5b702ddc5cec21638042c41d6a2f3cfc168933655b6a8eab].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 25s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/miaxz3vbn2yqu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 309 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 309 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/309/ to view the results.

Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #308

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/308/display/redirect?page=changes>

Changes:

[Andrew Pilloud] [BEAM-14321] SQL passes Null for Null aggregates

[noreply] Create apache-hop-with-dataflow.md

[noreply] Add files via upload

[noreply] Delete website/www/site/content/en/blog/apache-hop-with-dataflow

[noreply] Add files via upload

[Andrew Pilloud] [BEAM-14348] Upgrade to ZetaSQL 2022.04.1

[Andrew Pilloud] [BEAM-13735] Enable ZetaSQL tests for Java 17

[noreply] Update apache-hop-with-dataflow.md

[noreply] Update apache-hop-with-dataflow.md

[noreply] Update apache-hop-with-dataflow.md

[danielamartinmtz] Moved up get-credentials instruction for getting the kubeconfig file

[noreply] Merge pull request #17438: [BEAM-8127] The GCP module to declare

[noreply] Merge pull request #17428: [BEAM-14326] Make sure BigQuery daemon thread

[noreply] [BEAM-14301] Add lint:ignore to noescape() func (#17355)

[noreply] [BEAM-14286] Remove unused vars in harness package (#17392)

[noreply] [BEAM-14327] Convert Results to QueryResults directly (#17398)

[noreply] [BEAM-14302] Simplify boolean check in fn.go (#17399)

[noreply] [BEAM-13983] Sklearn Loader for RunInference (#17368)

[noreply] Update authors.yml

[noreply] [BEAM-14358] add retry to connect to testcontainer (#17449)

[noreply] [BEAM-13106] Bump flink docs to 1.14 (#17430)


------------------------------------------
[...truncated 741.39 KB...]
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Apr 23, 2022 4:00:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-23T16:00:58.322Z: Cancel request is committed for workflow job: 2022-04-23_05_45_56-1929918636908947708.
Apr 23, 2022 4:00:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-23T16:00:58.438Z: Cleaning up.
Apr 23, 2022 4:00:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-23T16:00:58.570Z: Stopping worker pool...
Apr 23, 2022 4:00:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-23T16:00:58.634Z: Stopping worker pool...
Apr 23, 2022 4:01:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-23T16:01:33.810Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 23, 2022 4:01:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-23T16:01:33.845Z: Worker pool stopped.
Apr 23, 2022 4:01:40 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-23_05_45_56-1929918636908947708 finished with status CANCELLED.
Load test results for test (ID): 37b7d6fd-4532-42d4-98a5-bbc9c34abfbc and timestamp: 2022-04-23T12:45:45.705000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11523.203
dataflow_v2_java11_total_bytes_count             2.20852609E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220423124339
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:37f6842e595143aa8d4e68fd6910f9f7ca0ef3064852504db6370faad9eb4af5
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220423124339]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:37f6842e595143aa8d4e68fd6910f9f7ca0ef3064852504db6370faad9eb4af5]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220423124339] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:37f6842e595143aa8d4e68fd6910f9f7ca0ef3064852504db6370faad9eb4af5])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:37f6842e595143aa8d4e68fd6910f9f7ca0ef3064852504db6370faad9eb4af5
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:37f6842e595143aa8d4e68fd6910f9f7ca0ef3064852504db6370faad9eb4af5
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:37f6842e595143aa8d4e68fd6910f9f7ca0ef3064852504db6370faad9eb4af5].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 24s
109 actionable tasks: 74 executed, 31 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/pz4xhzphooedy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #307

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/307/display/redirect?page=changes>

Changes:

[mmack] [BEAM-14335] Spotless Spark sources

[mmack] [BEAM-14345] Force paranamer 2.8 for Spark Hadoop version tests to avoid

[kamil.bregula] Revert "[BEAM-14300] Fix Java precommit failure"

[kamil.bregula] Revert "Merge pull request #17223 from [BEAM-14215] Improve argument

[noreply] [BEAM-13657] Sunset python 3.6 (#17252)

[noreply] Removes unsupported Python 3.6 from the release validation script

[noreply] [BEAM-13984] Implement RunInference for PyTorch (#17196)

[noreply] [BEAM-13945] add json type support for java bigquery connector (#17209)

[noreply] [BEAM-14346] Fix incorrect error case index in ret2() (#17425)

[noreply] [BEAM-14342] Fix wrong default buffer type in fn_runner (#17420)

[noreply] Updates opencensus-api dependency to the latest version - 0.31.0

[noreply] [BEAM-14306] Add unit testing to pane coder (#17370)

[noreply] Updated the dep and golden for screen diff integration tests (#17442)

[noreply] [BEAM-13657] Add python 3.6 update to CHANGES.md (#17435)


------------------------------------------
[...truncated 138.70 KB...]
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
Apr 22, 2022 4:00:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-22T16:00:45.734Z: Cancel request is committed for workflow job: 2022-04-22_05_45_31-18196702156697109876.
Apr 22, 2022 4:00:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-22T16:00:45.788Z: Cleaning up.
Apr 22, 2022 4:00:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-22T16:00:45.848Z: Stopping **** pool...
Apr 22, 2022 4:00:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-22T16:00:45.900Z: Stopping **** pool...
Apr 22, 2022 4:01:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-22T16:01:19.989Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 22, 2022 4:01:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-22T16:01:20.036Z: Worker pool stopped.
Apr 22, 2022 4:01:27 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-22_05_45_31-18196702156697109876 finished with status CANCELLED.
Load test results for test (ID): d0d1ae6d-7e55-48a0-9a66-1f9209084957 and timestamp: 2022-04-22T12:45:25.075000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11513.218
dataflow_v2_java11_total_bytes_count             3.30933291E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220422124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dde5d8755a66939925e14c3b0895234d781209a20db326aa7b641f0db9a7943a
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220422124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dde5d8755a66939925e14c3b0895234d781209a20db326aa7b641f0db9a7943a]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220422124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dde5d8755a66939925e14c3b0895234d781209a20db326aa7b641f0db9a7943a])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dde5d8755a66939925e14c3b0895234d781209a20db326aa7b641f0db9a7943a
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dde5d8755a66939925e14c3b0895234d781209a20db326aa7b641f0db9a7943a
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dde5d8755a66939925e14c3b0895234d781209a20db326aa7b641f0db9a7943a].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 6s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/nxbjx4vtzi26y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #306

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/306/display/redirect?page=changes>

Changes:

[vachan] Annotating Read API tests.

[bulat.safiullin] [BEAM-14247] [Website] add image

[bulat.safiullin] [BEAM-14247] [Website] center image

[mattcasters] BEAM-1857 : CHANGES.md entry for 2.38.0

[mmack] [BEAM-14323] Improve IDE integration of Spark cross version builds

[noreply] [BEAM-14112] Fixed ReadFromBigQuery with Interactive Beam (#17306)

[noreply] Update .asf.yaml (#17409)

[noreply] [BEAM-14336] Sickbay flight delays test - dataset seems to be missing

[noreply] [BEAM-14338] Update watermark unit tests to use time.Time.Equals()

[noreply] [BEAM-14328] Tweaks to "Differences from pandas" page (#17413)

[Andrew Pilloud] [BEAM-14253] Disable broken test pending Dataflow fix

[yiru] fix: BigQuery Storage Connector trace id population missing bracket

[noreply] [BEAM-14330] Temporarily disable the clusters auto-cleanup (#17400)

[noreply] Update Beam website to release 2.38.0 (#17378)

[noreply] [BEAM-14213] Add API and construction time validation for Batched DoFns

[noreply] Minor: Update release guide regarding archive.apache.org (#17419)

[noreply] [BEAM-14017] beam_PreCommit_CommunityMetrics_Cron test failing (#17396)

[noreply] BEAM-13582 Fixing broken links in the documentation (#17300)


------------------------------------------
[...truncated 1.36 MB...]
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
Apr 21, 2022 2:22:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-04-21T14:22:02.029Z: Staged package beam-runners-java-fn-execution-2.39.0-SNAPSHOT-B5zxJJ5wEeq7LSr_PWVtwrsAU-Ea4rE2CwHw00_o9d0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.39.0-SNAPSHOT-B5zxJJ5wEeq7LSr_PWVtwrsAU-Ea4rE2CwHw00_o9d0.jar' is inaccessible.
Apr 21, 2022 2:22:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-04-21T14:22:02.131Z: Staged package beam-sdks-java-extensions-arrow-2.39.0-SNAPSHOT--6o0gUFJinkN4upVAkTzRdayVoYWlnjhjcsg2E3us1A.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-arrow-2.39.0-SNAPSHOT--6o0gUFJinkN4upVAkTzRdayVoYWlnjhjcsg2E3us1A.jar' is inaccessible.
Apr 21, 2022 2:22:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-04-21T14:22:02.179Z: Staged package beam-sdks-java-extensions-google-cloud-platform-core-2.39.0-SNAPSHOT-34gk8NosZjW1cvWVuOwtZQD2UIGdR-qRbF_odnGiUSU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.39.0-SNAPSHOT-34gk8NosZjW1cvWVuOwtZQD2UIGdR-qRbF_odnGiUSU.jar' is inaccessible.
Apr 21, 2022 2:22:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-04-21T14:22:02.230Z: Staged package beam-sdks-java-extensions-protobuf-2.39.0-SNAPSHOT-IZH8Wbx3TFivq164axlD8yUXQ84r7Lu3HmCKcNTUkOY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.39.0-SNAPSHOT-IZH8Wbx3TFivq164axlD8yUXQ84r7Lu3HmCKcNTUkOY.jar' is inaccessible.
Apr 21, 2022 2:22:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-04-21T14:22:02.368Z: Staged package beam-sdks-java-io-synthetic-2.39.0-SNAPSHOT-jarS5mJehhs9eb-NZbUKGKDXV420CBUVy0a4JE4KVs4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-synthetic-2.39.0-SNAPSHOT-jarS5mJehhs9eb-NZbUKGKDXV420CBUVy0a4JE4KVs4.jar' is inaccessible.
Apr 21, 2022 2:22:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-04-21T14:22:02.436Z: Staged package beam-sdks-java-test-utils-2.39.0-SNAPSHOT-uDXAbi02WU3WDaNGqETzLc5tbQsW_968W1HOImTwz-s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.39.0-SNAPSHOT-uDXAbi02WU3WDaNGqETzLc5tbQsW_968W1HOImTwz-s.jar' is inaccessible.
Apr 21, 2022 2:22:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-04-21T14:22:05.829Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Apr 21, 2022 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-21T16:00:44.504Z: Cancel request is committed for workflow job: 2022-04-21_05_45_40-13538467564060971928.
Apr 21, 2022 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-21T16:00:44.578Z: Cleaning up.
Apr 21, 2022 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-21T16:00:44.687Z: Stopping **** pool...
Apr 21, 2022 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-21T16:00:44.729Z: Stopping **** pool...
Apr 21, 2022 4:01:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-21T16:01:21.156Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 21, 2022 4:01:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-21T16:01:21.198Z: Worker pool stopped.
Apr 21, 2022 4:01:32 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-21_05_45_40-13538467564060971928 finished with status CANCELLED.
Load test results for test (ID): f745bf34-932e-424f-8e43-a7130260d7f1 and timestamp: 2022-04-21T12:45:31.743000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11563.409
dataflow_v2_java11_total_bytes_count             3.34455127E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220421124337
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3b7f087ce8e62b57565ab4947f707956996a065a79b0d6a17afc6796b83b62f5
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220421124337]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3b7f087ce8e62b57565ab4947f707956996a065a79b0d6a17afc6796b83b62f5]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220421124337] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3b7f087ce8e62b57565ab4947f707956996a065a79b0d6a17afc6796b83b62f5])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3b7f087ce8e62b57565ab4947f707956996a065a79b0d6a17afc6796b83b62f5
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3b7f087ce8e62b57565ab4947f707956996a065a79b0d6a17afc6796b83b62f5
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3b7f087ce8e62b57565ab4947f707956996a065a79b0d6a17afc6796b83b62f5].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 14s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/wkgdcgn4krkik

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #305

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/305/display/redirect?page=changes>

Changes:

[andyye333] Change func to PTransform

[noreply] Populate actual dataflow job id to bigquery write trace id (#17130)

[relax] mark static thread as a daemon thread

[noreply] [BEAM-13866] Add miscellaneous exec unit tests (#17363)


------------------------------------------
[...truncated 163.99 KB...]
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Apr 20, 2022 12:53:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-04-20T12:53:00.997Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Apr 20, 2022 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-20T16:00:34.390Z: Cancel request is committed for workflow job: 2022-04-20_05_45_54-16325466336428050495.
Apr 20, 2022 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-20T16:00:34.498Z: Cleaning up.
Apr 20, 2022 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-20T16:00:34.598Z: Stopping **** pool...
Apr 20, 2022 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-20T16:00:34.658Z: Stopping **** pool...
Apr 20, 2022 4:01:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-20T16:01:30.048Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 20, 2022 4:01:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-20T16:01:30.081Z: Worker pool stopped.
Apr 20, 2022 4:01:36 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-20_05_45_54-16325466336428050495 finished with status CANCELLED.
Load test results for test (ID): a5df4288-8990-4bd3-812f-39b8accadd7a and timestamp: 2022-04-20T12:45:47.183000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11534.559
dataflow_v2_java11_total_bytes_count             3.37011862E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220420124340
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c97597e4fdb6d9a9423e194d73ab41a599ea27988f6e47df849663dfa0125ee6
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220420124340]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c97597e4fdb6d9a9423e194d73ab41a599ea27988f6e47df849663dfa0125ee6]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220420124340] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c97597e4fdb6d9a9423e194d73ab41a599ea27988f6e47df849663dfa0125ee6])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c97597e4fdb6d9a9423e194d73ab41a599ea27988f6e47df849663dfa0125ee6
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c97597e4fdb6d9a9423e194d73ab41a599ea27988f6e47df849663dfa0125ee6
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c97597e4fdb6d9a9423e194d73ab41a599ea27988f6e47df849663dfa0125ee6].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 18s
109 actionable tasks: 73 executed, 32 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/nz2uvwxznclpk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #304

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/304/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-14251] add output_coder_override to ExpansionRequest

[Heejong Lee] remove null

[rarokni] [BEAM-14307] Fix Slow Side input pattern bug in sample

[Heejong Lee] better error msg

[Heejong Lee] update from comments

[noreply] [BEAM-14316] Introducing KafkaIO.Read implementation compatibility

[noreply] [BEAM-14290] Address staticcheck warnings in the reflectx package

[noreply] [BEAM-14302] Simply bools in fn.go, genx_test.go (#17356)

[noreply] Merge pull request #17382: [BEAM-12356] Close DatasetService leak as


------------------------------------------
[...truncated 261.92 KB...]
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Apr 19, 2022 4:00:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-19T16:00:49.680Z: Cancel request is committed for workflow job: 2022-04-19_05_45_55-535538171355708343.
Apr 19, 2022 4:00:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-19T16:00:49.779Z: Cleaning up.
Apr 19, 2022 4:00:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-19T16:00:49.854Z: Stopping **** pool...
Apr 19, 2022 4:00:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-19T16:00:49.960Z: Stopping **** pool...
Apr 19, 2022 4:01:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-19T16:01:43.642Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 19, 2022 4:01:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-19T16:01:43.678Z: Worker pool stopped.
Apr 19, 2022 4:01:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-19_05_45_55-535538171355708343 finished with status CANCELLED.
Load test results for test (ID): 3f5fa757-4912-42a7-b8ba-973d4b05fbf1 and timestamp: 2022-04-19T12:45:48.897000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11560.823
dataflow_v2_java11_total_bytes_count             3.46050983E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220419124335
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c8dcc3fa75372b81381d851a17464a4ac59e3658bd4c4b1d5962a51dc35bd956
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220419124335]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c8dcc3fa75372b81381d851a17464a4ac59e3658bd4c4b1d5962a51dc35bd956]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220419124335] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c8dcc3fa75372b81381d851a17464a4ac59e3658bd4c4b1d5962a51dc35bd956])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c8dcc3fa75372b81381d851a17464a4ac59e3658bd4c4b1d5962a51dc35bd956
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c8dcc3fa75372b81381d851a17464a4ac59e3658bd4c4b1d5962a51dc35bd956
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c8dcc3fa75372b81381d851a17464a4ac59e3658bd4c4b1d5962a51dc35bd956].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 34s
109 actionable tasks: 73 executed, 32 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/4ibjpd67k4equ

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #303

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/303/display/redirect>

Changes:


------------------------------------------
[...truncated 655.28 KB...]
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Apr 18, 2022 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-18T16:00:30.662Z: Cancel request is committed for workflow job: 2022-04-18_05_45_47-13209903418256996392.
Apr 18, 2022 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-18T16:00:30.741Z: Cleaning up.
Apr 18, 2022 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-18T16:00:30.911Z: Stopping **** pool...
Apr 18, 2022 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-18T16:00:30.968Z: Stopping **** pool...
Apr 18, 2022 4:01:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-18T16:01:19.329Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 18, 2022 4:01:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-18T16:01:19.369Z: Worker pool stopped.
Apr 18, 2022 4:01:26 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-18_05_45_47-13209903418256996392 finished with status CANCELLED.
Load test results for test (ID): 189b664a-726c-459c-a133-958fa5a0169d and timestamp: 2022-04-18T12:45:41.042000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11469.352
dataflow_v2_java11_total_bytes_count             2.36511354E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220418124336
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:89ab6f052c40cb360051626c4f6cdfde7e85898840885ada979e72fb9c98e6c2
Deleted: sha256:1ec5f703b9ac8e4a167c391c956c769638a8eedcd75558b9031a714d7391182e
Deleted: sha256:502dd26e4d88151e44ce34d6b136e0f8fd6170319f0c6d385717d8cf1cbe3998
Deleted: sha256:4c25c9953c1cddb6694af0aad5eb6d30c91e89bed3a5a9f80a146ad032e7e57a
Deleted: sha256:ea8130b99716a67314db7a31a7a4c595cee3ee32e5b829a33b0c77b302d32f34
Deleted: sha256:1376a80356289e2aacd6cb3940f940b80b518a9779ea524dc8933ac57ff1677c
Deleted: sha256:699a16d3e2b4f50e7ad33994b21254b6f5324618b6968f6d3f078a0497004d0b
Deleted: sha256:7c284cfbf4f3516e73a07530700f8628157464d72bd85a05822ff01a8bbeb035
Deleted: sha256:91052244f2314234e7e9f5fa832ec0c3edf999a3b7e2fe11d624e153bcbee939
Deleted: sha256:040167bdb1764bed8053a33a3169e137b235f7912cf44cfdd771ed5de81a855a
Deleted: sha256:1f48e4ef9fbe743fe438cc12e6441bbe8e2fcbc53fd67d911a132a88b83efc91
Deleted: sha256:78c0a69cfd2247003641f6cb0b3b5552c9982ef38b5bb30d9b50dea5e119c224
Deleted: sha256:796a4cf734f0dde17dee4380b00114cf0f1100e3522f87eba9fe8baf5f1563d7
Deleted: sha256:6353684dd55b322e3d6dd4bbfa294a04683bd370650483c919c07ec9f707ec84
Deleted: sha256:2b70b843724b3d11a9800c430ac7177d3c3579444eef50a16499768a90f5cdd2
Deleted: sha256:b8e633a12f1e8b25a75556cac683027b068723b425e6120e83f657469421fb8b
Deleted: sha256:c605c1e695d857e10b57a06cd9ebb040ca81291ae7416dd3e95e408c7aa0eb23
Deleted: sha256:6a103cca2c22726d89d90d2628a725aaf3568403d7be4d6450f69549c8df8a65
Deleted: sha256:64dab1490ac2e4480044748168b5c88085d3be31f11b3238644629c1dd397428
Deleted: sha256:1790877ef888d1aabb27475eb4a573b311a141b20845e5c887f444a5d572054b
Deleted: sha256:58cd360e05a1181ada37ee6321e66a1b38e915fd72aee7d7fe7dceffd4223614
Deleted: sha256:ed7bf16eecc47eb13f94e6ba381c07d173c5d1a4c45dc53c39f9e15a7b11a268
Deleted: sha256:2ea09870d1b1414688072fda804ad6d0d953d9a81602a386248bb30a98dee992
Deleted: sha256:287d89da71f59fbcbf706a2473f46e65a7728160eead7c2555771227092c839d
Deleted: sha256:ebebd746445cba24203ded1032d3a6eff9b260808a871b1c5ab218489c8496cd
Deleted: sha256:707c1c2a741175934b01e73104d9de40b824c22c70a989cb31b749a409d340f2
Deleted: sha256:49157ee4f6879645ca5b31f10a2d8ef84d66a8ec0074bc2f036cd83b43eaf199
Deleted: sha256:6b68e09e0f356fbf63c6b3ea4cccec04fa2ee6c8d4e4d78e762b79b7d11ce8db
Deleted: sha256:cbe769884ea3dfcd51b46eb53d1a4b6f255268e1328c65b9049cced85b017c2f
Deleted: sha256:d0db2f82df0cedcb9f95f8642f88f27816bf24bc68ed4e422d3e97b982bc5da8
Deleted: sha256:8020a6c237de1aeac82a74f515936f01eca6b27c0740e9be94839d7ff54b1851
Deleted: sha256:cc51c68be2011fef510a08b8c63ac84717746668c5c74ebc2eba6496b7c78e14
Deleted: sha256:f459ac7496996a031670a5f7ea65fa8c6478cb9538abee245e205accf5c5a75b
Deleted: sha256:1a3c7a06f95e8186591d5cba251ab0d04b69826601d01e84bc5970306a05389c
Deleted: sha256:886a13d1e1e61938b7c93608a4d4347ae963129e6ce34974563dcf97bfbe9898
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220418124336]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:89ab6f052c40cb360051626c4f6cdfde7e85898840885ada979e72fb9c98e6c2]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220418124336] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:89ab6f052c40cb360051626c4f6cdfde7e85898840885ada979e72fb9c98e6c2])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:89ab6f052c40cb360051626c4f6cdfde7e85898840885ada979e72fb9c98e6c2
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:89ab6f052c40cb360051626c4f6cdfde7e85898840885ada979e72fb9c98e6c2
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:89ab6f052c40cb360051626c4f6cdfde7e85898840885ada979e72fb9c98e6c2].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 10s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/x6pw5rmumqdi2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #302

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/302/display/redirect>

Changes:


------------------------------------------
[...truncated 754.30 KB...]
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Apr 17, 2022 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-17T16:00:43.392Z: Cancel request is committed for workflow job: 2022-04-17_05_45_40-1893110379336830713.
Apr 17, 2022 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-17T16:00:43.460Z: Cleaning up.
Apr 17, 2022 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-17T16:00:43.569Z: Stopping **** pool...
Apr 17, 2022 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-17T16:00:43.614Z: Stopping **** pool...
Apr 17, 2022 4:01:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-17T16:01:29.970Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 17, 2022 4:01:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-17T16:01:30.012Z: Worker pool stopped.
Apr 17, 2022 4:01:36 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-17_05_45_40-1893110379336830713 finished with status CANCELLED.
Load test results for test (ID): 01af813c-24b3-4562-a236-71dd9882bd10 and timestamp: 2022-04-17T12:45:32.309000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11540.157
dataflow_v2_java11_total_bytes_count             3.50469579E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220417124340
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:321c3e5823ec3f23df07b8846b809c250fe4da35d07e59185eec1c7d3fd77993
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220417124340]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:321c3e5823ec3f23df07b8846b809c250fe4da35d07e59185eec1c7d3fd77993]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220417124340] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:321c3e5823ec3f23df07b8846b809c250fe4da35d07e59185eec1c7d3fd77993])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:321c3e5823ec3f23df07b8846b809c250fe4da35d07e59185eec1c7d3fd77993
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:321c3e5823ec3f23df07b8846b809c250fe4da35d07e59185eec1c7d3fd77993
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:321c3e5823ec3f23df07b8846b809c250fe4da35d07e59185eec1c7d3fd77993].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 16s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/4rzqtbgipje5w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #301

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/301/display/redirect?page=changes>

Changes:

[pandiana] BigQueryServicesImpl: reduce number of threads spawned by

[noreply] [BEAM-13204] Fix website bug where code tabs do not appear if the


------------------------------------------
[...truncated 869.47 KB...]
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
[... above message repeated 17 more times ...]
Apr 16, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-16T16:00:42.988Z: Cancel request is committed for workflow job: 2022-04-16_05_45_53-13457546018423208057.
Apr 16, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-16T16:00:43.024Z: Cleaning up.
Apr 16, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-16T16:00:43.102Z: Stopping worker pool...
Apr 16, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-16T16:00:43.155Z: Stopping worker pool...
Apr 16, 2022 4:01:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-16T16:01:33.948Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 16, 2022 4:01:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-16T16:01:33.982Z: Worker pool stopped.
Apr 16, 2022 4:01:40 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-16_05_45_53-13457546018423208057 finished with status CANCELLED.
Load test results for test (ID): 93020d73-185f-4a44-9245-d20f2b8abf04 and timestamp: 2022-04-16T12:45:42.142000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11523.842
dataflow_v2_java11_total_bytes_count             2.31982371E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
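The stack trace above shows the load-test harness treating a terminal state of CANCELLED as a failure: JobFailure.handleFailure throws once the job finishes in any state other than the expected one, which is what turns a cancelled Dataflow job into a red build. A minimal sketch of that decision, written here as shell rather than the actual Java (check_job_state is a hypothetical helper, not part of Beam):

```shell
#!/bin/sh
# Hypothetical sketch: map a terminal Dataflow job state to an exit code,
# mirroring the "Invalid job state: CANCELLED." RuntimeException above.
check_job_state() {
  case "$1" in
    DONE) return 0 ;;                                  # expected terminal state
    *) echo "Invalid job state: $1." >&2; return 1 ;;  # anything else fails the run
  esac
}

check_job_state "CANCELLED" || echo "load test failed"
```

Under this sketch, a job cancelled by the 4-hour timeout produces a nonzero check and the build is marked failed even though the metrics above were still collected.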

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220416124340
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e02cb7b61e8e31263720f99a64d52cf0c931929ac62d09c3d32d598ab91872e0
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220416124340]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e02cb7b61e8e31263720f99a64d52cf0c931929ac62d09c3d32d598ab91872e0]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220416124340] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e02cb7b61e8e31263720f99a64d52cf0c931929ac62d09c3d32d598ab91872e0])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e02cb7b61e8e31263720f99a64d52cf0c931929ac62d09c3d32d598ab91872e0
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e02cb7b61e8e31263720f99a64d52cf0c931929ac62d09c3d32d598ab91872e0
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e02cb7b61e8e31263720f99a64d52cf0c931929ac62d09c3d32d598ab91872e0].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 20s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/sihbrpstga5p2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #300

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/300/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Upgrade to Gradle 7.4

[Kenneth Knowles] Remove Python module dependency on Dataflow worker

[noreply] [BEAM-13925] Dont double assign committers if author or other reviewer

[noreply] [BEAM-13739] Remove deprecated shallow clone funcs (#17362)

[noreply] [BEAM-11104] Pipe Continuation to DataSource level (#17334)

[noreply] [BEAM-11105] Basic Watermark Estimation (Wall Clock Observing) (#17267)

[noreply] Respect output coder for TextIO. (#17367)

[noreply] Merge pull request #17200 from [BEAM-12164]: fix the autoscaling backlog

[noreply] [BEAM-17035] Call python3 directly when it is available. (#17366)

[noreply] Merge pull request #17375: [BEAM-8691] Declare newer


------------------------------------------
[...truncated 655.71 KB...]
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
[... similar messages repeated 21 more times, the last truncated ...]
Apr 15, 2022 4:01:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-15T16:01:07.045Z: Cancel request is committed for workflow job: 2022-04-15_05_45_37-15443102269930690646.
Apr 15, 2022 4:01:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-15T16:01:07.147Z: Cleaning up.
Apr 15, 2022 4:01:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-15T16:01:07.243Z: Stopping worker pool...
Apr 15, 2022 4:01:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-15T16:01:07.291Z: Stopping worker pool...
Apr 15, 2022 4:02:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-15T16:01:58.779Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 15, 2022 4:02:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-15T16:01:58.823Z: Worker pool stopped.
Apr 15, 2022 4:02:05 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-15_05_45_37-15443102269930690646 finished with status CANCELLED.
Load test results for test (ID): e11b7a60-422c-460e-9560-6f77b6ded17d and timestamp: 2022-04-15T12:45:28.920000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11603.514
dataflow_v2_java11_total_bytes_count             3.43644528E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220415124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:837acb140e871f602a1bfdea8e8c28ed0b1e9a458ab48d2d986ff464eca7dc12
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220415124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:837acb140e871f602a1bfdea8e8c28ed0b1e9a458ab48d2d986ff464eca7dc12]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220415124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:837acb140e871f602a1bfdea8e8c28ed0b1e9a458ab48d2d986ff464eca7dc12])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:837acb140e871f602a1bfdea8e8c28ed0b1e9a458ab48d2d986ff464eca7dc12
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:837acb140e871f602a1bfdea8e8c28ed0b1e9a458ab48d2d986ff464eca7dc12
ERROR: (gcloud.container.images.delete) Not found: response: {'docker-distribution-api-version': 'registry/2.0', 'content-type': 'application/json', 'date': 'Fri, 15 Apr 2022 16:02:15 GMT', 'server': 'Docker Registry', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '168', '-content-encoding': 'gzip'}
Failed to compute blob liveness for manifest: 'sha256:837acb140e871f602a1bfdea8e8c28ed0b1e9a458ab48d2d986ff464eca7dc12': None
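Here the second failure is the cleanup task itself: the image digest was already gone, so `gcloud container images delete` returned 404 and the script's nonzero exit failed the build. A common hardening (a sketch only, not what cleanup_untagged_gcr_images.sh actually does; delete_image is a hypothetical stand-in for the gcloud call) is to make deletion idempotent by tolerating "Not found":

```shell
#!/bin/sh
# Hypothetical sketch: tolerate already-deleted images so cleanup
# does not fail the build when a digest was removed by another run.
# delete_image stands in for `gcloud container images delete <digest>`.
delete_image() {
  echo "Not found: $1" >&2   # simulate the 404 the real command returned
  return 1
}

IMAGE="us.gcr.io/example/java@sha256:0000"
delete_image "$IMAGE" || echo "image already gone, continuing"
```

With the `|| ...` fallback the script exits 0 on an already-deleted digest, so only the genuine load-test failure would mark this build red.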

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 297

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 54s
109 actionable tasks: 73 executed, 32 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/abctlamyx2boc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #299

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/299/display/redirect?page=changes>

Changes:

[relax] handle changing schemas in Storage API sink

[noreply] Fix a couple style issues (#17361)

[noreply] [BEAM-14287] Clean up staticcheck warnings in graph/coder (#17337)

[noreply] Improvements to dataflow job service for non-Python jobs. (#17338)

[noreply] Bump minimist (#17290)

[noreply] Bump ansi-regex (#17291)

[noreply] Bump nanoid (#17292)

[noreply] Bump lodash (#17293)

[noreply] Bump url-parse (#17294)

[noreply] Bump moment (#17328)

[noreply] Merge pull request #15549 from [BEAM-11997] Changed RedisIO


------------------------------------------
[...truncated 654.99 KB...]
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Apr 14, 2022 4:01:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-14T16:01:12.588Z: Cancel request is committed for workflow job: 2022-04-14_05_45_54-2231122832314727981.
Apr 14, 2022 4:01:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-14T16:01:18.246Z: Cleaning up.
Apr 14, 2022 4:01:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-14T16:01:18.364Z: Stopping **** pool...
Apr 14, 2022 4:01:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-14T16:01:18.427Z: Stopping **** pool...
Apr 14, 2022 4:02:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-14T16:02:18.554Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 14, 2022 4:02:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-14T16:02:18.874Z: Worker pool stopped.
Apr 14, 2022 4:02:29 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-14_05_45_54-2231122832314727981 finished with status CANCELLED.
Load test results for test (ID): f401f197-2a45-499f-80a4-77bbc693edc5 and timestamp: 2022-04-14T12:45:45.279000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11538.034
dataflow_v2_java11_total_bytes_count             3.41144067E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220414124351
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c7ff4e78f8ffd93d4b430e072ef74a7fcb05e3a16ad0655872b0d2024308f9bd
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220414124351]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c7ff4e78f8ffd93d4b430e072ef74a7fcb05e3a16ad0655872b0d2024308f9bd]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220414124351] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c7ff4e78f8ffd93d4b430e072ef74a7fcb05e3a16ad0655872b0d2024308f9bd])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c7ff4e78f8ffd93d4b430e072ef74a7fcb05e3a16ad0655872b0d2024308f9bd
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c7ff4e78f8ffd93d4b430e072ef74a7fcb05e3a16ad0655872b0d2024308f9bd
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c7ff4e78f8ffd93d4b430e072ef74a7fcb05e3a16ad0655872b0d2024308f9bd].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 58s
109 actionable tasks: 73 executed, 32 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/73vooq5szskqe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #298

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/298/display/redirect?page=changes>

Changes:

[Valentyn Tymofieiev] Add remaining Dataflow test suites for Python 3.9.

[Heejong Lee] [BEAM-14232] Only resolve artifacts in expanded environments for Java

[noreply] Fix test ordering issue (#17350)

[buqian] Do not pass null to MoreObjects.firstNonNull as default value

[ningkang0957] [BEAM-14288] Fixed flaky test

[noreply] [BEAM-14277] Disables Spanner change streams tests (#17346)

[noreply] [BEAM-14219] Run cleanup script to remove stale prebuilt SDK container

[Heejong Lee] [BEAM-14300] Fix Java precommit failure

[noreply] [BEAM-14116] Rollback "Chunk commit requests dynamically (#17004)"

[noreply] [BEAM-13982] A base class for run inference (#16970)

[ningkang0957] Enumerates all possible expected strings when asserting

[noreply] [BEAM-13966] Add pivot(), a non-deferred column operation on categorical


------------------------------------------
[...truncated 555.28 KB...]
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
Apr 13, 2022 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-13T16:00:35.458Z: Cancel request is committed for workflow job: 2022-04-13_05_45_45-7529282448056351323.
Apr 13, 2022 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-13T16:00:35.556Z: Cleaning up.
Apr 13, 2022 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-13T16:00:35.636Z: Stopping **** pool...
Apr 13, 2022 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-13T16:00:35.683Z: Stopping **** pool...
Apr 13, 2022 4:01:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-13T16:01:28.433Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 13, 2022 4:01:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-13T16:01:28.480Z: Worker pool stopped.
Apr 13, 2022 4:01:34 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-13_05_45_45-7529282448056351323 finished with status CANCELLED.
Load test results for test (ID): ec3320c5-8274-4966-8dc3-a89ca2233096 and timestamp: 2022-04-13T12:45:38.674000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11537.592
dataflow_v2_java11_total_bytes_count             2.37041049E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220413124340
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:704ee023c9d62d7defeea410b35744157d61aca7db6c7b0b782f5aba5fef0044
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220413124340]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:704ee023c9d62d7defeea410b35744157d61aca7db6c7b0b782f5aba5fef0044]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220413124340] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:704ee023c9d62d7defeea410b35744157d61aca7db6c7b0b782f5aba5fef0044])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:704ee023c9d62d7defeea410b35744157d61aca7db6c7b0b782f5aba5fef0044
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:704ee023c9d62d7defeea410b35744157d61aca7db6c7b0b782f5aba5fef0044
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:704ee023c9d62d7defeea410b35744157d61aca7db6c7b0b782f5aba5fef0044].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 15s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/cfh3haijantjy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #297

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/297/display/redirect?page=changes>

Changes:

[kamil.bregula] [BEAM-14215] Improve argument validation in SnowflakeIO

[benjamin.gonzalez] [BEAM-14013] Add PreCommit Kotlin examples Jenkins Job

[Andrew Pilloud] [BEAM-13151] Support multiple layers of AutoValue nesting

[Heejong Lee] [BEAM-14233] Merge requirements from expanded response for Java External

[benjamin.gonzalez] [BEAM-14013] Add spark, direct, flink runners as triggers for Kotlin

[noreply] [BEAM-13898] Add tests to the pubsubx package. (#17324)

[noreply] [BEAM-14285] Clean up Staticcheck Warnings in io packages (#17336)

[noreply] [BEAM-14187] Fix concurrency issue in IsmReaderImpl (#17201)

[noreply] [BEAM-14288] Skip flaking test

[noreply] Simplify specifying additional dependencies in Go SDK in XLang IOs

[noreply] [BEAM-14240] Clean staticcheck warnings in runner packages (#17340)

[Daniel Oliveira] [BEAM-13538] Workaround to fix go-licenses crash.


------------------------------------------
[...truncated 812.27 KB...]
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
Apr 12, 2022 3:23:13 PM org.apache.beam.sdk.metrics.MetricsEnvironment getCurrentContainer
WARNING: Reporting metrics are not supported in the current execution environment.
Apr 12, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-12T16:00:43.530Z: Cancel request is committed for workflow job: 2022-04-12_05_45_22-5020297268774329537.
Apr 12, 2022 4:00:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-12T16:00:48.261Z: Cleaning up.
Apr 12, 2022 4:00:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-12T16:00:48.345Z: Stopping worker pool...
Apr 12, 2022 4:00:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-12T16:00:48.396Z: Stopping worker pool...
Apr 12, 2022 4:01:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-12T16:01:41.907Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 12, 2022 4:01:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-12T16:01:41.956Z: Worker pool stopped.
Apr 12, 2022 4:01:47 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-12_05_45_22-5020297268774329537 finished with status CANCELLED.
Load test results for test (ID): b2324388-84fe-43bb-92f7-13aa47d6219b and timestamp: 2022-04-12T12:45:14.909000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11555.637
dataflow_v2_java11_total_bytes_count             3.05510772E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220412124327
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7614622c10543b7187b89c13c9b2ab535e6ced6f7cff33873e52c09ebcb7c043
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220412124327]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7614622c10543b7187b89c13c9b2ab535e6ced6f7cff33873e52c09ebcb7c043]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220412124327] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7614622c10543b7187b89c13c9b2ab535e6ced6f7cff33873e52c09ebcb7c043])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7614622c10543b7187b89c13c9b2ab535e6ced6f7cff33873e52c09ebcb7c043
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7614622c10543b7187b89c13c9b2ab535e6ced6f7cff33873e52c09ebcb7c043
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7614622c10543b7187b89c13c9b2ab535e6ced6f7cff33873e52c09ebcb7c043].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 31s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/s7mrplnjhakla

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #296

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/296/display/redirect>

Changes:


------------------------------------------
[...truncated 254.51 KB...]
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
[... the same message and trace repeated 30 more times; log truncated mid-trace ...]
Apr 11, 2022 4:01:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-11T16:01:45.612Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 11, 2022 4:01:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-11T16:01:45.689Z: Worker pool stopped.
Apr 11, 2022 4:01:54 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-11_05_45_36-12117919394498028756 finished with status CANCELLED.
Load test results for test (ID): 4a1bb991-4d39-4d60-bfcd-dbea64e26389 and timestamp: 2022-04-11T12:45:27.045000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11574.858
dataflow_v2_java11_total_bytes_count             2.80069345E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220411124329
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ac7d90f9f7391f6755f9aceeedf0fdb022411de6f9506506c752276aedbafed8
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220411124329]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ac7d90f9f7391f6755f9aceeedf0fdb022411de6f9506506c752276aedbafed8]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220411124329] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ac7d90f9f7391f6755f9aceeedf0fdb022411de6f9506506c752276aedbafed8])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ac7d90f9f7391f6755f9aceeedf0fdb022411de6f9506506c752276aedbafed8
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ac7d90f9f7391f6755f9aceeedf0fdb022411de6f9506506c752276aedbafed8
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ac7d90f9f7391f6755f9aceeedf0fdb022411de6f9506506c752276aedbafed8].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 36s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/gpzndns4535pg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #295

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/295/display/redirect?page=changes>

Changes:

[chamikaramj] Re-raise exceptions swallowed in several Python I/O connectors

[noreply] Merge pull request #16928: [BEAM-11971] Re add reverted timer


------------------------------------------
[...truncated 702.32 KB...]
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
[... the same message and trace repeated 29 more times; log truncated mid-trace ...]
Apr 10, 2022 4:00:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-10T16:00:47.657Z: Cancel request is committed for workflow job: 2022-04-10_05_45_43-13422413698553693979.
Apr 10, 2022 4:00:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-10T16:00:47.699Z: Cleaning up.
Apr 10, 2022 4:00:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-10T16:00:47.783Z: Stopping worker pool...
Apr 10, 2022 4:00:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-10T16:00:47.829Z: Stopping worker pool...
Apr 10, 2022 4:01:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-10T16:01:48.828Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 10, 2022 4:01:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-10T16:01:48.877Z: Worker pool stopped.
Apr 10, 2022 4:01:56 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-10_05_45_43-13422413698553693979 finished with status CANCELLED.
Load test results for test (ID): ca2d5976-ffb7-43b5-a039-246c361a60cb and timestamp: 2022-04-10T12:45:35.076000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11576.342
dataflow_v2_java11_total_bytes_count             2.50155426E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220410124337
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:86c886d0f7a061f996b72aaa644914f96ff6049b0a6dddc4173b2f5e6d0b4c7f
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220410124337]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:86c886d0f7a061f996b72aaa644914f96ff6049b0a6dddc4173b2f5e6d0b4c7f]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220410124337] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:86c886d0f7a061f996b72aaa644914f96ff6049b0a6dddc4173b2f5e6d0b4c7f])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:86c886d0f7a061f996b72aaa644914f96ff6049b0a6dddc4173b2f5e6d0b4c7f
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:86c886d0f7a061f996b72aaa644914f96ff6049b0a6dddc4173b2f5e6d0b4c7f
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:86c886d0f7a061f996b72aaa644914f96ff6049b0a6dddc4173b2f5e6d0b4c7f].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 41s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/5velzl35hywgi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #294

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/294/display/redirect?page=changes>

Changes:

[benjamin.gonzalez] [BEAM-11714] Change spotBugs jenkins config

[Robert Bradshaw] Cleanup docs on Shared.

[Kyle Weaver] Nit: correct description for precommit cron jobs.

[benjamin.gonzalez] [BEAM-11714] Add dummy class for testing

[benjamin.gonzalez] [BEAM-11714] Remove dummy class used for testing

[benjamin.gonzalez] [BEAM-11714] Spotbugs print toJenkins UI precommit_Java17

[noreply] [BEAM-13767] Remove eclipse plugin as it generates a lot of unused tasks

[noreply] [BEAM-10708] Updated beam_sql error message (#17314)

[noreply] [BEAM-14281] add as_deterministic_coder to nullable coder (#17322)

[noreply] Improvements to Beam/Spark quickstart. (#17129)

[chamikaramj] Disable BigQueryIOStorageWriteIT for Runner v2 test suite


------------------------------------------
[...truncated 853.49 KB...]
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
Apr 09, 2022 4:00:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-09T16:00:52.086Z: Cancel request is committed for workflow job: 2022-04-09_05_45_55-9203481995567457931.
Apr 09, 2022 4:00:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-09T16:00:52.176Z: Cleaning up.
Apr 09, 2022 4:00:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-09T16:00:52.280Z: Stopping **** pool...
Apr 09, 2022 4:00:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-09T16:00:52.325Z: Stopping **** pool...
Apr 09, 2022 4:01:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-09T16:01:46.020Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 09, 2022 4:01:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-09T16:01:46.060Z: Worker pool stopped.
Apr 09, 2022 4:01:51 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-09_05_45_55-9203481995567457931 finished with status CANCELLED.
Load test results for test (ID): 6b7f91f0-abdb-43cc-8a5d-d59bfa48c6f2 and timestamp: 2022-04-09T12:45:45.560000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11529.946
dataflow_v2_java11_total_bytes_count             4.01203839E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220409124339
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:91488cb19b21f3952c6af34cbf79b9ca5f39dd8a86e399f1364049a3bf0cfb32
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220409124339]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:91488cb19b21f3952c6af34cbf79b9ca5f39dd8a86e399f1364049a3bf0cfb32]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220409124339] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:91488cb19b21f3952c6af34cbf79b9ca5f39dd8a86e399f1364049a3bf0cfb32])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:91488cb19b21f3952c6af34cbf79b9ca5f39dd8a86e399f1364049a3bf0cfb32
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:91488cb19b21f3952c6af34cbf79b9ca5f39dd8a86e399f1364049a3bf0cfb32
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:91488cb19b21f3952c6af34cbf79b9ca5f39dd8a86e399f1364049a3bf0cfb32].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 32s
109 actionable tasks: 75 executed, 30 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/gnp6a6mogytgc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #293

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/293/display/redirect?page=changes>

Changes:

[johnjcasey] [BEAM-10529] add java and generic components of nullable xlang tests

[johnjcasey] [BEAM-10529] fix test case

[johnjcasey] [BEAM-10529] add coders and typehints to support nullable xlang coders

[johnjcasey] [BEAM-10529] update external builder to support nullable coder

[johnjcasey] [BEAM-10529] clean up coders.py

[johnjcasey] [BEAM-10529] add coder translation test

[johnjcasey] [BEAM-10529] add additional check to typecoder to not accidentally

[johnjcasey] [BEAM-10529] add test to retrieve nullable coder from typehint

[johnjcasey] [BEAM-10529] run spotless

[johnjcasey] [BEAM-10529] add go nullable coder

[johnjcasey] [BEAM-10529] cleanup extra println

[johnjcasey] [BEAM-10529] improve comments, clean up python

[bulat.safiullin] [BEAM-13992] [Website] update Contribute/Code Contribution Guide page

[bulat.safiullin] [BEAM-13992] [Website] change text, transfer tag a

[bulat.safiullin] [BEAM-13992] [Website] change code tags

[bulat.safiullin] [BEAM-13992] [Website] change text

[bulat.safiullin] [BEAM-13992] [Website] change text and links, add empty lines

[bulat.safiullin] [BEAM-13991] [Website] change links, add contribute file

[bulat.safiullin] [BEAM-13991] [Website] add content, add styles

[bulat.safiullin] [BEAM-13991] [Website] add images, add styles, delete spaces

[bulat.safiullin] [BEAM-13991] [Website] change url and aliases, delete bullet points

[bulat.safiullin] [BEAM-13991] [Website] add empty line

[bulat.safiullin] [BEAM-13992] [Website] change links, change text

[bulat.safiullin] [BEAM-13992] [Website] change links, add text, add dots

[bulat.safiullin] [BEAM-13992] [Website] change links, change text

[bulat.safiullin] [BEAM-13991] [Website] change styles, change quotes

[bulat.safiullin] [BEAM-13991] [Website] change link color

[bulat.safiullin] [BEAM-13992] [Website] change text, delete whitespace

[bulat.safiullin] [BEAM-13991] [Website] change text

[bulat.safiullin] [BEAM-13992] [Website] update text

[bulat.safiullin] [BEAM-13991] [Website] added changes from PR 13992, changed get-starting

[shivrajw] [BEAM-14236] Parquet IO support for list to conform with Apache Parquet

[chamikaramj] Sets 'sdk_harness_container_images' property for all Dataflow jobs -

[chamikaramj] Sets 'sdk_harness_container_images' property for all Dataflow jobs -

[mmack] [BEAM-14104] Support shard aware aggregation in Kinesis writer.

[noreply] [BEAM-13015] Lookup the container for the step once when registering

[noreply] [BEAM-14175] Log read loop abort at debug rather than error (#17183)

[noreply] [BEAM-11745] Fix author list rendering (#17308)

[noreply] [BEAM-14144] Record JFR profiles when GC thrashing is detected (#17151)

[noreply] Factors enable_prime flag in when checking use_unified_worker conditions

[noreply] [BEAM-11104] Add ProcessContinuation type to Go SDK (#17265)

[noreply] BEAM-13939: Restructure Protos to fix namespace conflicts (#16961)

[noreply] [BEAM-14270] Mark {Snowflake/BigQuery}Services as @Internal (#17309)

[noreply] [BEAM-13901] Add unit tests for graphx/cogbk.go

[noreply] [BEAM-14259, BEAM-14266] Remove unused function, replace use of ptypes

[noreply] [BEAM-14274] Fix staticcheck warnings in pipelinex (#17311)

[noreply] [BEAM-13857] Switched Go IT script to using Go flags for expansion

[noreply] Update python beam-master container image. (#17313)


------------------------------------------
[...truncated 755.83 KB...]
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
Apr 08, 2022 3:19:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-08T15:19:37.446Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Apr 08, 2022 3:19:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-08T15:19:39.208Z: Worker configuration: e2-standard-2 in us-central1-b.
Apr 08, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-08T16:00:42.847Z: Cancel request is committed for workflow job: 2022-04-08_05_45_42-2041097195106622571.
Apr 08, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-08T16:00:42.874Z: Cleaning up.
Apr 08, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-08T16:00:43.032Z: Stopping **** pool...
Apr 08, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-08T16:00:43.155Z: Stopping **** pool...
Apr 08, 2022 4:01:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-08T16:01:37.851Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 08, 2022 4:01:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-08T16:01:37.887Z: Worker pool stopped.
Apr 08, 2022 4:01:45 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-08_05_45_42-2041097195106622571 finished with status CANCELLED.
Load test results for test (ID): b05126ee-2dab-4a31-ae6c-24ab590b1100 and timestamp: 2022-04-08T12:45:36.735000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11561.262
dataflow_v2_java11_total_bytes_count             2.98809411E10
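As a quick sanity check on the two metrics reported above, the implied sustained throughput can be computed directly. This is a back-of-the-envelope sketch: the metric names and values are taken from the log, everything else is illustrative.

```java
// Back-of-the-envelope throughput from the load-test metrics logged above.
public class ThroughputCheck {
    public static void main(String[] args) {
        double runtimeSec = 11561.262;      // dataflow_v2_java11_runtime_sec
        double totalBytes = 2.98809411e10;  // dataflow_v2_java11_total_bytes_count

        double mbPerSec = totalBytes / runtimeSec / 1e6;
        System.out.printf("%.2f MB/s%n", mbPerSec);  // roughly 2.58 MB/s
    }
}
```

A useful quick check when comparing runs: the runtimes across these builds are nearly identical (~11,5xx seconds, i.e. the test ran to its time limit and was cancelled), so the bytes count is the value that actually varies between runs.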
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
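The RuntimeException above is the load-test harness converting a non-successful terminal job state into a build failure: the job was cancelled after its time limit, and the harness treats any terminal state other than a clean completion as an error, which propagates up and fails the Gradle task. A minimal sketch of that kind of check follows; it is hypothetical, not the actual source of org.apache.beam.sdk.loadtests.JobFailure, and the enum below is a stand-in for Beam's real job-state type.

```java
// Hypothetical sketch of the terminal-state check behind the
// "Invalid job state: CANCELLED." failure above. Not the actual Beam
// source; org.apache.beam.sdk.loadtests.JobFailure holds the real logic.
public class JobFailureSketch {
    // Stand-in for the runner's terminal job states.
    enum JobState { DONE, CANCELLED, FAILED }

    static void handleFailure(JobState terminalState) {
        // Anything other than a clean DONE is treated as a load-test
        // failure, which fails the :sdks:java:testing:load-tests:run task.
        if (terminalState != JobState.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(JobState.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());  // Invalid job state: CANCELLED.
        }
    }
}
```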

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220408124339
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3c1c419433272497a573127bfe64b5841194c123546630eef250427c6b3b70d4
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220408124339]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3c1c419433272497a573127bfe64b5841194c123546630eef250427c6b3b70d4]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220408124339] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3c1c419433272497a573127bfe64b5841194c123546630eef250427c6b3b70d4])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3c1c419433272497a573127bfe64b5841194c123546630eef250427c6b3b70d4
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3c1c419433272497a573127bfe64b5841194c123546630eef250427c6b3b70d4
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3c1c419433272497a573127bfe64b5841194c123546630eef250427c6b3b70d4].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 27s
109 actionable tasks: 75 executed, 30 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/bxg5gzdvr7abm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 292 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 292 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/292/ to view the results.

Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #291

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/291/display/redirect?page=changes>

Changes:

[bingyeli] update query

[Robert Bradshaw] [BEAM-14250] Fix coder registration for types defined in __main__.

[johnjcasey] [BEAM-14256] update SpEL dependency to 5.3.18.RELEASE

[johnjcasey] [BEAM-14256] remove .RELEASE

[dannymccormick] Fix dependency issue causing failures

[Kyle Weaver] [BEAM-9649] Add region option to Mongo Dataflow test.

[noreply] Allow get_coder(None).

[noreply] [BEAM-13015] Disable retries for fnapi grpc channels which otherwise

[noreply] [BEAM-13952] Sickbay

[noreply] BEAM-14235 parquetio module does not parse PEP-440 compliant Pyarrow

[noreply] [Website] Contribution guide page indent bug fix (#17287)

[noreply] [BEAM-10976] Document go sdk bundle finalization (#17048)

[noreply] [BEAM-13829] Expose status API from Go SDK Harness (#16957)


------------------------------------------
[...truncated 1.03 MB...]
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Apr 06, 2022 4:02:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-06T16:02:13.696Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 06, 2022 4:02:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-06T16:02:13.733Z: Worker pool stopped.
Apr 06, 2022 4:02:19 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-06_05_46_55-4508266517087523086 finished with status CANCELLED.
Load test results for test (ID): 143f8abc-73d9-4b99-8094-83b227b4f62a and timestamp: 2022-04-06T12:46:47.409000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11528.749
dataflow_v2_java11_total_bytes_count             3.68830994E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220406124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b9c052bd6e2171b15a461fc07e6ad6bf6874a8ab9f4e13d19813150a3c85c6cb
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220406124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b9c052bd6e2171b15a461fc07e6ad6bf6874a8ab9f4e13d19813150a3c85c6cb]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220406124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b9c052bd6e2171b15a461fc07e6ad6bf6874a8ab9f4e13d19813150a3c85c6cb])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b9c052bd6e2171b15a461fc07e6ad6bf6874a8ab9f4e13d19813150a3c85c6cb
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b9c052bd6e2171b15a461fc07e6ad6bf6874a8ab9f4e13d19813150a3c85c6cb
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b9c052bd6e2171b15a461fc07e6ad6bf6874a8ab9f4e13d19813150a3c85c6cb].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 6s
109 actionable tasks: 75 executed, 30 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ck7rhv4q7gguo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #290

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/290/display/redirect?page=changes>

Changes:

[benjamin.gonzalez] [BEAM-8970] Add docs to run wordcount example on portable Spark Runner

[Kiley Sok] Update python container version

[benjamin.gonzalez] [BEAM-8970] Add period to end of sentence

[Kyle Weaver] Add self-descriptive message for expected errors.

[noreply] Add --dataflowServiceOptions=enable_prime to useUnifiedWorker conditions

[noreply] [BEAM-10529] nullable xlang coder (#16923)

[noreply] Fix go fmt break in core/typex/special.go (#17266)

[noreply] [BEAM-5436] Add doc page on Go cross compilation. (#17256)

[noreply] Pr-bot Don't count all reviews as approvals (#17269)

[noreply] Fix postcommits (#17263)

[noreply] [BEAM-14241] Address staticcheck warnings in boot.go (#17264)

[noreply] [BEAM-14157] GrpcWindmillServer: Use stream specific boolean to do

[noreply] [BEAM-10582] Allow (and test) pyarrow 7 (#17229)

[noreply] [BEAM-13519] Solve race issues when the server responds with an error


------------------------------------------
[...truncated 1.72 MB...]
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Apr 05, 2022 4:02:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-05T16:02:14.448Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 05, 2022 4:02:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-05T16:02:14.491Z: Worker pool stopped.
Apr 05, 2022 4:02:19 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-05_05_45_43-2506730444122301441 finished with status CANCELLED.
Load test results for test (ID): 2c253c62-4e8c-44af-b8bc-3c8e4c587d5a and timestamp: 2022-04-05T12:45:31.764000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11519.292
dataflow_v2_java11_total_bytes_count             2.52596736E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220405124336
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f2db508b0e27636e7fa361f27383558175bdf4db5e9c472323d9c93767611a0e
Deleted: sha256:112455dd743255cd4142c71bf65465bd4d589fe9bd330094e1e06f4333d9bbef
Deleted: sha256:17c97defd89aa909f174418271cc5658ca149c4b47a3abb60660ee8de1dbd51a
Deleted: sha256:721415831a394aafc22130a19812af3e37f11a0186d94f04fc2f91b54d22f62b
Deleted: sha256:f06102cabaa495b3a0e5ea4eba537dd7f56dd103d4e9b52efd38fe87b6c3cade
Deleted: sha256:b5a53f985fc6053e40b2ed2bb5cc24b9f8c42173c3d68bf6050ac5def1d4afd1
Deleted: sha256:a7736871842c582c8c11b2a1ebcbb77b7a71045b1d6cf40dc6a37be320fea2ce
Deleted: sha256:1733a3bd430700bf753b7f9c79b62068f265fdae77b92ad78297c11eee9bcaeb
Deleted: sha256:02d0d9f0fba75e759edd1e0c98ff306b9d29ff7363a9568bae8377042892b4e2
Deleted: sha256:f03bfe19ce440d014f48b8afdb9b4c4ad47576bf32d0cbbdb23f3c985ee96dc5
Deleted: sha256:e73a7ac2bb9a6e549ab94d5c213a6f67b8aad589309fb263a94cbfb095f681c3
Deleted: sha256:b82c89170f8415fc881c47ae757ecc782f575f9f2c51cb710a8d5e07788e210a
Deleted: sha256:3f2a85cc0d4f424871ce1388346ad288a437892a620d11eb28cfadfbb493690f
Deleted: sha256:c24a55c429436b1d31be48533882bf0eeb050c18ab9a1f0dbd728545535352b8
Deleted: sha256:0070cf30f1d0e43830fa85bda23941c6b7325b59c6d041bdca8354aa7d298191
Deleted: sha256:085a41c30fc6afafe12ff9ec454fdf7f5f94ed6151a99c2b365907364687d0b9
Deleted: sha256:0b74321c6e8002ea242f764ad71f8fb66b6b1ffbb2bccbefd74eed829e9d34f9
Deleted: sha256:f9f178369cd830cf5befb36924c607c76c33ad1699ec2ecda1903656e8b27c05
Deleted: sha256:534fd263d803cca9505474b811759c44eed532ac77959280a902fae9dadf5b96
Deleted: sha256:fab95d723decb47c6678362f8b3c2b25198ce173d51150542a84827e3aec993d
Deleted: sha256:8a10a54ee17632ccf1ae55a926e389899aff573a36fa2bf0f963f27f6cf3fe69
Deleted: sha256:19f7e422b7e9d8b29098e31a03665e92e55543cfc903aa30b352635d21415309
Deleted: sha256:feabc6d7ed3bdc59eab53ec8689658831f9d34f90be6859970ea6e116f7d9fb1
Deleted: sha256:0182acb58f71d292ad001dc7a60952f2e35e8d60c39c0039c864c9556c8896e6
Deleted: sha256:835372d2c0fdcd58cb04618ba76d684b2d67504d8464d8f51c4f691370ce3331
Deleted: sha256:13dd1cb62b871d715035430c68ab841ac243ccc5e10bb40655b5ab2bb91d17ca
Deleted: sha256:9915e9f5ed63f1c84a6836b98d38c3e7248bb89539d9e8517c0f177d5fe8eaa8
Deleted: sha256:17f8b2b336daf04f19ca589e1b763da38863e48812d4ff68532063fd9bae20b8
Deleted: sha256:0c1841dbfb9448a4cce79709d00d02109dcac7b2b103f6a385ff74e11c95c252
Deleted: sha256:fa2542e854678f75c16c88e6a81c649abe293b5066767322290b61986944d2e5
Deleted: sha256:5d27c6fa4fa3363850d33862864957008dc0e8e0d22142593fc65474fdea5596
Deleted: sha256:a95953eef7e697fde1172cab2688024a7a4de808a8f92b9a2028a693dec6d39e
Deleted: sha256:5aaabbc9fe4058ce11c463dd7750c1ff5190f223eb3247fe13d4c66ca0dc0efe
Deleted: sha256:daaa950aeef4e4df143cf9c9bbb57a8c3fa652bd5bb9cbdcc4fa3b829f0f180b
Deleted: sha256:03d5be2a345222358e64806164d2607ba3ccc4369fd3e5479df4a62e63d53af9
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220405124336]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f2db508b0e27636e7fa361f27383558175bdf4db5e9c472323d9c93767611a0e]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220405124336] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f2db508b0e27636e7fa361f27383558175bdf4db5e9c472323d9c93767611a0e])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f2db508b0e27636e7fa361f27383558175bdf4db5e9c472323d9c93767611a0e
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f2db508b0e27636e7fa361f27383558175bdf4db5e9c472323d9c93767611a0e
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f2db508b0e27636e7fa361f27383558175bdf4db5e9c472323d9c93767611a0e].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ef2e32b5e5b62e6653188e891ea43a7f16a95758d2a2a60ca5a5bf9a17408f92
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ef2e32b5e5b62e6653188e891ea43a7f16a95758d2a2a60ca5a5bf9a17408f92
ERROR: (gcloud.container.images.delete) Not found: response: {'docker-distribution-api-version': 'registry/2.0', 'content-type': 'application/json', 'date': 'Tue, 05 Apr 2022 16:02:28 GMT', 'server': 'Docker Registry', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '168', '-content-encoding': 'gzip'}
Failed to compute blob liveness for manifest: 'sha256:ef2e32b5e5b62e6653188e891ea43a7f16a95758d2a2a60ca5a5bf9a17408f92': None

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 297

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 5s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/c4rn47r3mfank

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 289 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 289 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/289/ to view the results.

Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #288

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/288/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14143] Simplifies the ExternalPythonTransform API (#17101)


------------------------------------------
[...truncated 295.62 KB...]
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
[... identical "work item requesting state read" messages repeated; truncated ...]
Apr 03, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-03T16:00:39.377Z: Cancel request is committed for workflow job: 2022-04-03_05_45_57-2963163796664725090.
Apr 03, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-03T16:00:39.462Z: Cleaning up.
Apr 03, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-03T16:00:39.578Z: Stopping **** pool...
Apr 03, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-03T16:00:39.622Z: Stopping **** pool...
Apr 03, 2022 4:03:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-03T16:03:08.357Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 03, 2022 4:03:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-03T16:03:08.396Z: Worker pool stopped.
Apr 03, 2022 4:03:16 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-03_05_45_57-2963163796664725090 finished with status CANCELLED.
Load test results for test (ID): 908a0718-a56d-455d-92d1-7aebf9e8d896 and timestamp: 2022-04-03T12:45:45.253000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11533.827
dataflow_v2_java11_total_bytes_count             3.34972313E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220403124341
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:8c0a2a38bc4b8bccd8befa15a4737e8f910c6f9b59356c955de1dfd75c7922c9
Deleted: sha256:f4af704a14a04a1c49951ee7c0ccf75c49811872a791471daf8125b3faf3b282
Deleted: sha256:215590b381a56dab37e41c20636738d9c151fa076d68de02e4f86fb2b9dac0bb
Deleted: sha256:5b272d0dd4f81c5a5032b3251e3b13a7e3b1bad7c81aede4f0067545bff4ca02
Deleted: sha256:8ab8ed78caea516a65eb3bec5dabac1ef77cdca1494fd68d52c538ce1d656459
Deleted: sha256:6fdc1dcfb3297c96386e9f74c59cc9c80021cff9af439bb41de8bb428b46febd
Deleted: sha256:26362c5629effd8032c9c5ef9150708104dc6be6c2fd664a49665cd2f1c6ca0c
Deleted: sha256:5f5be13865d46997c8d6bc58da30428a1858eca493bcb34197dafe1186b4be3c
Deleted: sha256:d19b2d69cc6efe290ebe05137c7a70cbb2365b1a9a62dd865c1f7cd48de78e96
Deleted: sha256:1b760c0e2f771fb0a5a44aeb09995df9d5af3f98a57407c55430384a8cd91eea
Deleted: sha256:b8c50573217a96da8e29cd640507e8e17d3cd49e35182befbf2fd1ebf9031cc9
Deleted: sha256:9f3810cea69ecaae51a74f013f9222d61695948ee473010589dcfd4d42f4bcae
Deleted: sha256:ec7736eeb5fefa350191e149367e3b386e179b4f903fc062bc0ec0e94ec872a0
Deleted: sha256:cc499e4c26ac1c671afa5346eae4d4948048fd91d2d78b0810faf71186e542dc
Deleted: sha256:a6857b0028659a257dd7537cc7651c03a94c88889037792ef341dc1966dbdff3
Deleted: sha256:0a74abd474793dee7255f258e8dbded2616f8fbadf519e6bfc47b8011b39d739
Deleted: sha256:8c12066a4a418438ce6e6d99eba61ee886154e73dd3ce9230a5d25b10699b125
Deleted: sha256:142a9feed93e8b1b372feda4e21d19ec7f6ad11a2c899d11eda00fdcf2208efe
Deleted: sha256:9521422e0a28c8619d124f272b96c7c98a2d6fb272bd38f78df6a13fb8b56db9
Deleted: sha256:e0829c03d5bff6d9decf0cb25cf8a32cfb731cb45881387b627789c63b2cd50d
Deleted: sha256:314d5fd66875b9fc34ebf621faf01d9c3d18f450ebaddce632bd9a8ff7646333
Deleted: sha256:ef8e33bb60db5a540b9aa66440ee0f3198e2a71846ce3caa38c477bc2a169dd7
Deleted: sha256:c06e19645507d973677da891ce46c0d5b16763e1cd55c4d4fb32335c8b8ce475
Deleted: sha256:00da483fa65d17bd41c0860efb6f9a4a1218590eb7e8562deb10ece46cfb88c6
Deleted: sha256:93a772c4e73a7d6e2f4afa7af7a231a3592cb9e183e43bd0c964cdbfc693531b
Deleted: sha256:9f3eceb03b02923b1a37bb86b036ebf148b0b2f1ae78b2ed6df2756559f20b4e
Deleted: sha256:a057e013d733186eeb93addbf8ad8e18e33c09e364aba8add2f15ba6f1931ce2
Deleted: sha256:fe256c8331e385f987adde5cd5e284be692f9dc268258e6bb919ecca55e63e75
Deleted: sha256:acad1ad4e0f97d3034fd76aa52906a747046c0107c3e7615e52a9e8cfdee6705
Deleted: sha256:54375d878ca524d07e6e350e49cf28e659491fe52f80b1a152e13d9b9432c7c8
Deleted: sha256:eb45fd1069959fbf20e57303fce73bb8c6cbaa283143fec4fff58d39b2158ef7
Deleted: sha256:ce4cfb1d37de654731589b7f3b5edbecef2b5913f8f1ade53836435396ed7025
Deleted: sha256:f1e62eeb7f95347e8f233a51e08ff6fe121a0a8a3e3c4401a88c111375d92729
Deleted: sha256:554d202182eb1edd02816141f22de5282c2b10df55c38f5ddefe908141b9885c
Deleted: sha256:f0f29b17b8fc8d5a48e916b12e51b55098681e34390501ee9993109c848c0c26
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220403124341]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:8c0a2a38bc4b8bccd8befa15a4737e8f910c6f9b59356c955de1dfd75c7922c9]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220403124341] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:8c0a2a38bc4b8bccd8befa15a4737e8f910c6f9b59356c955de1dfd75c7922c9])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:8c0a2a38bc4b8bccd8befa15a4737e8f910c6f9b59356c955de1dfd75c7922c9
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:8c0a2a38bc4b8bccd8befa15a4737e8f910c6f9b59356c955de1dfd75c7922c9
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:8c0a2a38bc4b8bccd8befa15a4737e8f910c6f9b59356c955de1dfd75c7922c9].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 55s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/5ytltnjxtw6lg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #287

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/287/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-14133] Fix potential NPE in BigQueryServicesImpl.getErrorInfo

[Robert Bradshaw] Revert "Revert "[BEAM-14038] Auto-startup for Python expansion service.

[Robert Bradshaw] Skip failing test for now.

[Kyle Weaver] [BEAM-14225] load balance jenkins jobs

[noreply] [BEAM-14153] Reshuffled Row Coder PCollection used as Side Input cause

[noreply] delint go sdk (#17247)

[Heejong Lee] add test

[noreply] Merge pull request #16841 from [BEAM-8823] Make FnApiRunner work by

[noreply] [BEAM-14192] Update legacy container version (#17210)

[noreply] Fix mishandling of API with BQIO (#17211)

[noreply] [BEAM-14221] Update documentation with Flink on Dataproc features

[Kiley Sok] Revert "[BEAM-14190] Python sends dataflow schema field"


------------------------------------------
[...truncated 474.36 KB...]
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
[... identical "work item requesting state read" messages repeated; truncated ...]
Apr 02, 2022 4:00:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-02T16:00:46.592Z: Cancel request is committed for workflow job: 2022-04-02_05_46_03-17864023057532030584.
Apr 02, 2022 4:00:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-02T16:00:46.662Z: Cleaning up.
Apr 02, 2022 4:00:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-02T16:00:46.731Z: Stopping **** pool...
Apr 02, 2022 4:00:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-02T16:00:46.783Z: Stopping **** pool...
Apr 02, 2022 4:03:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-02T16:03:07.152Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Apr 02, 2022 4:03:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-04-02T16:03:07.195Z: Worker pool stopped.
Apr 02, 2022 4:03:15 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-04-02_05_46_03-17864023057532030584 finished with status CANCELLED.
Load test results for test (ID): 12a65da8-3049-4dcd-8577-68d47d98366e and timestamp: 2022-04-02T12:45:53.146000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11546.232
dataflow_v2_java11_total_bytes_count             2.94911125E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
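The `Invalid job state: CANCELLED` exception above comes from the load-test harness treating any terminal Dataflow state other than `DONE` as a failure. As a rough illustration (a hypothetical sketch, not the actual `JobFailure` source; the enum and class names here are invented), the check behaves like this:

```java
// Hypothetical sketch: any terminal state other than DONE is turned into a
// RuntimeException, which is what gives the JVM the non-zero exit value 1
// that Gradle reports for :sdks:java:testing:load-tests:run.
enum JobState { DONE, FAILED, CANCELLED }

final class JobFailureSketch {
    static void handleFailure(JobState terminalState) {
        if (terminalState != JobState.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }
}
```

The "Cancel request is committed" log lines just before this suggest the job was cancelled externally (likely a test timeout, given the ~3h20m build duration), so the exception follows from that cancellation rather than from a pipeline defect.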

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220402124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:930991935b5007b4f00f10de3dc0717dc720638ac06b7d902a18065f8d83cc44
Deleted: sha256:6ea269a41de50d3a47a1dea248db227d94356813a7eb948caa3281993ac4e4c7
Deleted: sha256:0c7f7996f9b426f5ad46cc09308223f7e5e8cc0b3f19f6d17d33310858864da8
Deleted: sha256:28429ea6ed9493afd5f6bff7155bc7a397a0a6fbf587f66680553bbc13a5cd80
Deleted: sha256:eb092f05f2771fd31fe16646341b496a20a5873e5441104fd4ac516b1dd25a87
Deleted: sha256:bf6abc841977506a3a900ea806bc88adec104011517a5fe8163bd49f7d52664e
Deleted: sha256:93394312d41a80559413ec9f6f037205383e74597255777ed1b7e69236257957
Deleted: sha256:4ec9255cd4277d988ad72e929f163026fe80ddbb07df9aaada33535cff2e751b
Deleted: sha256:63dbab2ba1685eeba5aabbbb39cee5ad7d8274b062d5adcc946b31b211839528
Deleted: sha256:95e5f5e8db345865b4a847a81c19f15aa0f76eaadf241ad55ff527a4b9426d02
Deleted: sha256:76512dd68d63456e5c5c866bf023b8b982db93bf7c723d928f37123aaf9a9697
Deleted: sha256:0d911611f96dcff6568ccd6602bd5fa1ac17ae56e6fd3c49b9d9809c4879a37e
Deleted: sha256:8bb2ed8096716d7a2b96571e1a160431ec3aa623b514bc96b57b037962632a7f
Deleted: sha256:75c708005ee44038857c3e04f86fa5eac271aaabf1aa122e1f77caaa40e882d3
Deleted: sha256:e5101641bc454a102382d2c400dbc6acf633d59ab1b73fe99f04c44461f21320
Deleted: sha256:f45df805baaddfe82eeb8faeb5c2e11b0b706837522bc3d0bd558060fa538818
Deleted: sha256:357e75679900d549cbdef88a09a966ed3c8f299e66f92771b80edccd6fe099f7
Deleted: sha256:46cc871430866c82b7bfc70a648a509951e3707280580b2284262bd13162a11b
Deleted: sha256:feb042730c81888aa0d01861333519cca3bf0174812c4ca2310f540c13193792
Deleted: sha256:1cc87ee44f9adfb937bf3dc2b27c58cb085661da063bf1f2df06db88f3dd0ccd
Deleted: sha256:61effb045d284ee8e472e4a68bfe208da11bf1a7d680a23e40e2b197160cc0af
Deleted: sha256:73b8c14c06b77adda5239e9c2e00d7c9c6e1130cd38fd712ba14597e2f386b38
Deleted: sha256:733036b59a6f9991b3ff800c9b4e714b818d3eb998aded8829ef69fc573799a7
Deleted: sha256:f83e1c40a0b86173f8766de287dfa79ca47b0f9f8d7968a79e69f01194d1c0fe
Deleted: sha256:2de6488ed7c094ce5f699741d1297c3cfefccb4dc7c1b4f76abe425ca3242718
Deleted: sha256:37ed12301fb5ee89d4fd9bbacfa49355afb3da2effee258378e72c3b4ef2ac41
Deleted: sha256:1193a43f85f8fc7320ffde09b142a124fcfcab29ff21be96c0fd00bce6df2e85
Deleted: sha256:4eb8df06ceddccdc11a5472f6bb349ec1e04b1f040409a8dd7135b4cbc801aea
Deleted: sha256:2bff67ba2781d404a5caf1cb4afeea8bc353f1e411011a78c9c3b534459f0d89
Deleted: sha256:5316e98fe772fd801fea16b665257bb1938c90d6bcd608f8caf277d9be31df36
Deleted: sha256:e45369fb626a35bdd942790842e100a5e1d8482ec145b0d7d5624875c3e9afcf
Deleted: sha256:d9b86ce690804e9f35fdc86d15b5f5390383e093bc5975f040aae92ba87b053b
Deleted: sha256:e2f361049063557d966413c6d910575ed8ff4e557972173f2b2e5e12cc0c1d01
Deleted: sha256:b351f74177a0565be63a6997b149b17f61f44d226b64bbbe7a289c61dd7528fe
Deleted: sha256:6d16d4c8b2ea5a5bd943a60f6830c71e2b4ec7c9f12eca72f3b971965b048085
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220402124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:930991935b5007b4f00f10de3dc0717dc720638ac06b7d902a18065f8d83cc44]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220402124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:930991935b5007b4f00f10de3dc0717dc720638ac06b7d902a18065f8d83cc44])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:930991935b5007b4f00f10de3dc0717dc720638ac06b7d902a18065f8d83cc44
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:930991935b5007b4f00f10de3dc0717dc720638ac06b7d902a18065f8d83cc44
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:930991935b5007b4f00f10de3dc0717dc720638ac06b7d902a18065f8d83cc44].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 1s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/syn3jx7zflpzu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 286 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 286 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/286/ to view the results.

beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 285 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 285 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/285/ to view the results.

Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #284

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/284/display/redirect?page=changes>

Changes:

[vachan] Update display data to include BQ information.

[noreply] Revert "[BEAM-14084] iterable_input_value_types changed from list to

[chamikaramj] Convert URLs to local jars when constructing filesToStage

[Valentyn Tymofieiev] Ensure the removed option prebuild_sdk_container_base_image not used on

[noreply] [BEAM-13314]Revise recommendations to manage Python pipeline

[noreply] Merge pull request #17202 from [BEAM-14194]: Disallow autoscaling for

[noreply] Merge pull request #17080 from [BEAM-13880] [Playground] Increase test

[noreply] Merge pull request #17050 from [BEAM-13877] [Playground] Increase test

[noreply] [BEAM-14200] Improve SamzaJobInvoker extensibility (#17212)

[noreply] Merge pull request #17148 from [BEAM-14042] [playground] Scroll imports

[noreply] [BEAM-13918] Increase datastoreio go sdk unit test coverage (#17173)

[noreply] Merge pull request #16819: [BEAM-13806] Adding test suite for Go x-lang


------------------------------------------
[...truncated 821.31 KB...]
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:706
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
[...stanza above repeated many times (some occurrences also pass through dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40), log truncated mid-line...]
Mar 30, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-30T16:00:42.671Z: Cancel request is committed for workflow job: 2022-03-30_05_46_38-9178594846917040139.
Mar 30, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-30T16:00:42.732Z: Cleaning up.
Mar 30, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-30T16:00:42.814Z: Stopping **** pool...
Mar 30, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-30T16:00:42.859Z: Stopping **** pool...
Mar 30, 2022 4:03:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-30T16:03:16.459Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 30, 2022 4:03:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-30T16:03:16.512Z: Worker pool stopped.
Mar 30, 2022 4:03:22 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-30_05_46_38-9178594846917040139 finished with status CANCELLED.
Load test results for test (ID): d131cef7-ba52-45e5-8af3-00e4172f9140 and timestamp: 2022-03-30T12:46:22.016000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11473.976
dataflow_v2_java11_total_bytes_count             3.19157773E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220330124343
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ef6576384c88cf5981db22de19518b8f3e5374815b59df7cbe9a5484bd4ceec8
Deleted: sha256:b59d92435feb90d57c7f3761059e6254af98f044659624a5d8411a8ac260d6ff
Deleted: sha256:832cefbcca2675047fb3e718ac4652e1947794378fff732ee2c703442a7ecbb3
Deleted: sha256:379dcedff93d4bccbd6a780129d54db08a50145269a96f4baba8bedfb0095e6c
Deleted: sha256:0e506b1f340493748b011150c0be9133e2fe0528e918dfa3172f766067111c26
Deleted: sha256:cf82fc4972bc84d08cbdffdc74d6552896bd03357f7dbd473cdbf3a00a0c5fb7
Deleted: sha256:34b758711162e8d6d62d960b45ac865f00cfb4ccb1caeb82a567c2a9d30bd2bd
Deleted: sha256:0e2d81a169ecc59f886783cd6a198a698bc339fcbf08e7912a9963d788f5a656
Deleted: sha256:895866a738d26143e8f784247ae8785e825144f75dcc5bc6ab7eda520378c560
Deleted: sha256:e48db740ae37e8cc53278eb8ae63986874a920a9ac4c3d093ab0a4c72ad05044
Deleted: sha256:2d8b359827c8910bfa0672679db725d38e39dca433182a4b07ef8ee3b3d7fc64
Deleted: sha256:9c9c907c2f2a0db5e275fc99383c2ec24d5886adca993bbcec6a4466cfa0100a
Deleted: sha256:7a893255b1bd330b16a642f1a5acc32a377c863f2ac6d3404cb076a18dc6ed76
Deleted: sha256:116250576388b95d8c1d538bc571be2a421e1bc8f49cb12131998b3c5f42dc38
Deleted: sha256:dc3dbbf0fe40f82a2b44bd9228070305357b84702ceeb26f00fbbdc547204ba7
Deleted: sha256:c551afcdd1c89688461644d553375fc137986db949e556282ce47773d4dc3726
Deleted: sha256:ff2eaf6f109ace3e75263045485f3b63e0385409135414a56b6e76c989d06f2a
Deleted: sha256:bb4a67f0ab8b98fb8c2dc55b540f932414a9d1d8166ff7b32dd423c32074ad39
Deleted: sha256:c0e3f37222352507e090a10c77bf0fedc0b4e9a387fd1bb061b41ab52484d022
Deleted: sha256:28a1b122c796a093a1e8a309a4e537be521c1ba73e8d5fef6b7ff476814c0b4c
Deleted: sha256:71685098caccd538111ee4b338d58d91e1beb424a2a63181393d14917551d6e8
Deleted: sha256:20ad25f0e21848edf3cebbc107976c03852821559393f969f42c90f2dac6634d
Deleted: sha256:0fdfd038be7d6e21f2bd6026dbcc136849619fbfd6c550edbb24d6c8d287db93
Deleted: sha256:91a9d1885fd4e8c229a05cdf64a1854730a851abb1433d6604c5295534e8a23f
Deleted: sha256:5f9ab137519dbf6d007438362c3e873a8ca49fc1b800a4e9e7a3cd8382c2504f
Deleted: sha256:9e35fc3474d3b8189968d8114faf606fd53f6e0977eb7d1fb665d0874133ee2a
Deleted: sha256:dfeabd97d9d3afe33a7678038208e0390fa58d6a416cff0b0ff05d317c2ed20f
Deleted: sha256:0accfedad215a51ceaed31ed0e5ae0684990137d35ca95d83e86c15b31afe494
Deleted: sha256:61dfc7b0bad16bcb99a33336f2f4f69c05aff80849b942eb12bc07604599b9b9
Deleted: sha256:ce9242559fe6bbc228d458b419d02d647577909a6699ad52552685c8fc50d63a
Deleted: sha256:a5c898284925a7eca78f173d19f9742a0f2c64cda17b1c44b652c031b008c425
Deleted: sha256:80488eff4fc5e32d8ab5d4a39d406012d3cad13eadbf9af4cb749cbd2af39503
Deleted: sha256:233acb9cc03a05bd97906f6de130f0d11f5f60c3a28345743103de30ee190570
Deleted: sha256:dff26e1ee1973caa4b67a0d9081ae065ebe3f18ac28af6958a5a1d24747ddd48
Deleted: sha256:d66f3c5646e0e50526a798e870b15f588e7f4af2de8bf7674f29acd7b25817d0
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220330124343]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ef6576384c88cf5981db22de19518b8f3e5374815b59df7cbe9a5484bd4ceec8]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220330124343] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ef6576384c88cf5981db22de19518b8f3e5374815b59df7cbe9a5484bd4ceec8])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ef6576384c88cf5981db22de19518b8f3e5374815b59df7cbe9a5484bd4ceec8
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ef6576384c88cf5981db22de19518b8f3e5374815b59df7cbe9a5484bd4ceec8
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ef6576384c88cf5981db22de19518b8f3e5374815b59df7cbe9a5484bd4ceec8].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:13907352a71480629641f00abc705583ef645ae17d526a6741011f9942ec9435
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:13907352a71480629641f00abc705583ef645ae17d526a6741011f9942ec9435
ERROR: (gcloud.container.images.delete) Not found: response: {'docker-distribution-api-version': 'registry/2.0', 'content-type': 'application/json', 'date': 'Wed, 30 Mar 2022 16:03:31 GMT', 'server': 'Docker Registry', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '168', '-content-encoding': 'gzip'}
Failed to compute blob liveness for manifest: 'sha256:13907352a71480629641f00abc705583ef645ae17d526a6741011f9942ec9435': None
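The cleanup failure above is a race: by the time the script tried to delete this digest it was already gone, so `gcloud container images delete` returned 404 and the whole Gradle task failed. A minimal sketch of how such a script can tolerate stale digests (the `delete_image` stub and digest values are invented stand-ins for the real `gcloud` call; this is not the actual `cleanup_untagged_gcr_images.sh`):

```shell
#!/usr/bin/env bash
# Hypothetical sketch of a 404-tolerant cleanup loop. delete_image is a stub
# standing in for `gcloud container images delete`; one digest simulates an
# image that was already removed (the 404 seen above).
set -u

delete_image() {
  case "$1" in
    *deadbeef*) return 1 ;;  # simulate "Not found" for an already-deleted digest
    *) return 0 ;;
  esac
}

failed=0
for digest in sha256:cafe1234 sha256:deadbeef sha256:feed5678; do
  # "|| ..." absorbs the per-digest failure so one stale digest does not
  # abort the loop or fail the surrounding Gradle task.
  delete_image "$digest" || failed=$((failed + 1))
done
echo "skipped ${failed} stale digest(s)"
```

With the stub above, the loop finishes and reports one skipped digest instead of exiting non-zero.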

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 297

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 10s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/zypmqrlxpgwek

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #283

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/283/display/redirect?page=changes>

Changes:

[hengfeng] feat: remove the metadata table after the pipeline finishes

[thiagotnunes] test: add test for metadata table dropping

[noreply] Minor: Add warning about pubsub client to Beam 2.36.0 blog (#17188)

[noreply] [BEAM-14177] Fix GBK re-iteration caching for portable runners. (#17184)

[noreply] Merge pull request #17187: [BEAM-14181] Make sure to evict connections

[noreply] Only reset transform.label if it is correctly assigned (#17192)

[noreply] [BEAM-12641] Use google-auth instead of oauth2client for GCP auth

[Robert Bradshaw] [BEAM-14163] Fix typo in single core per container logic.

[thiagotnunes] test: disable SpannerIO.readChangeStream test

[noreply] Merge pull request #17164 from [BEAM-14140][Playground] Fix Deploy

[noreply] Merge pull request #16855 from [BEAM-13938][Playground] Increase test


------------------------------------------
[...truncated 1.22 MB...]
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
[...identical stanza repeated many times...]
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmi
Mar 29, 2022 4:03:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-29T16:03:57.001Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 29, 2022 4:03:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-29T16:03:57.041Z: Worker pool stopped.
Mar 29, 2022 4:04:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-29_05_45_42-3420679915442567582 finished with status CANCELLED.
Load test results for test (ID): 0b0d09bb-6ff1-4ccb-b209-85e719dba5f0 and timestamp: 2022-03-29T12:45:36.049000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11600.794
dataflow_v2_java11_total_bytes_count             3.46139551E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220329124335
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9128757ea1d0c2012c85e5d3885511734ddd2c1796db89ae37c5514e81098d62
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220329124335]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9128757ea1d0c2012c85e5d3885511734ddd2c1796db89ae37c5514e81098d62]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220329124335] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9128757ea1d0c2012c85e5d3885511734ddd2c1796db89ae37c5514e81098d62])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9128757ea1d0c2012c85e5d3885511734ddd2c1796db89ae37c5514e81098d62
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9128757ea1d0c2012c85e5d3885511734ddd2c1796db89ae37c5514e81098d62
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9128757ea1d0c2012c85e5d3885511734ddd2c1796db89ae37c5514e81098d62].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 48s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/igevqjsgj2x66

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #282

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/282/display/redirect>

Changes:


------------------------------------------
[...truncated 48.91 KB...]
42a0228f5eff: Waiting
ff7681e6f667: Waiting
8e9a17c24b27: Preparing
c78d5f0db824: Preparing
4f317a0a2f77: Preparing
511ef2f24f7b: Waiting
517ee2bbfc94: Waiting
5cfdea2165bb: Waiting
3cddd9de99b4: Preparing
327e42081bbe: Preparing
e22383518335: Waiting
6e632f416458: Preparing
e019be289189: Preparing
c9a63110150b: Preparing
955b62aa3c71: Waiting
4f317a0a2f77: Waiting
8e9a17c24b27: Waiting
c78d5f0db824: Waiting
34dd462e9f6c: Waiting
3cddd9de99b4: Waiting
e019be289189: Waiting
c9a63110150b: Waiting
6e632f416458: Waiting
e01f6c731ce4: Pushed
0f893ef946a2: Pushed
06c98d22e319: Pushed
b1db51c0e3ba: Pushed
de39501648d0: Pushed
cb9dd7dfef31: Pushed
ff7681e6f667: Pushed
511ef2f24f7b: Pushed
42a0228f5eff: Pushed
e22383518335: Pushed
5cfdea2165bb: Pushed
517ee2bbfc94: Pushed
c78d5f0db824: Layer already exists
4f317a0a2f77: Layer already exists
3cddd9de99b4: Layer already exists
327e42081bbe: Layer already exists
6e632f416458: Layer already exists
e019be289189: Layer already exists
c9a63110150b: Layer already exists
34dd462e9f6c: Pushed
8e9a17c24b27: Pushed
955b62aa3c71: Pushed
20220328124335: digest: sha256:8ab0592c319a80611ebb313a3f4f7415a08345bc116f299ddd07bc4048b2dafa size: 4935

> Task :sdks:java:testing:load-tests:run
Mar 28, 2022 12:45:52 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Mar 28, 2022 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 207 files. Enable logging at DEBUG level to see which files will be staged.
Mar 28, 2022 12:45:53 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Mar 28, 2022 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Mar 28, 2022 12:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 207 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Mar 28, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 207 files cached, 0 files newly uploaded in 1 seconds
Mar 28, 2022 12:45:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Mar 28, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <115094 bytes, hash 0b429d10b60ad78250e238c05380b165f66d2f5946a9ad9b56d69b922f44dda0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-C0KdELYK14JQ4jjAU4CxZfZtL1lGqa2bVtabki9E3aA.pb
Mar 28, 2022 12:45:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Mar 28, 2022 12:45:59 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a485a36, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5cf3157b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@625dfff3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26350ea2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e9469b8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a08efdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57272109, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59696551, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@648d0e6d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79e66b2f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@17273273, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f69e2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@984169e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43f1bb92, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d6bbd35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c5d6175, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7544ac86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b27b497, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b1534d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c74aa0d]
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Mar 28, 2022 12:46:00 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b48e183, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@514de325, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@30c1da48, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a65cd8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f1ef9d6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@17461db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3fd9e827, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e682398, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@670b3ca, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24a86066, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54402c04, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5b3bb1f7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58d6b7b9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f1a4795, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a6f6c7e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c5ddccd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1dbd580, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c101cc1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d0d91a1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb48179]
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.39.0-SNAPSHOT
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-28_05_46_00-6615587993976244565?project=apache-beam-testing
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-03-28_05_46_00-6615587993976244565
Mar 28, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-03-28_05_46_00-6615587993976244565
Mar 28, 2022 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-03-28T12:46:04.707Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-03-ka2y. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Mar 28, 2022 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:09.862Z: Worker configuration: e2-standard-2 in us-central1-b.
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:10.541Z: Expanding SplittableParDo operations into optimizable parts.
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:10.564Z: Expanding CollectionToSingleton operations into optimizable parts.
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:10.621Z: Expanding CoGroupByKey operations into optimizable parts.
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:10.680Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:10.719Z: Expanding GroupByKey operations into streaming Read/Write steps
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:10.793Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:10.893Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:10.927Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:10.978Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.011Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.049Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.078Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.106Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.134Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.172Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.203Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.236Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.293Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.324Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.358Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.384Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.419Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.456Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.479Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.499Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.529Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.565Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.600Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.633Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:11.814Z: Running job using Streaming Engine
Mar 28, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:12.080Z: Starting 5 ****s in us-central1-b...
Mar 28, 2022 12:46:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:41.361Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Mar 28, 2022 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:46:51.745Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Mar 28, 2022 12:47:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T12:47:52.619Z: Workers have started successfully.
Mar 28, 2022 4:00:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T16:00:48.580Z: Cancel request is committed for workflow job: 2022-03-28_05_46_00-6615587993976244565.
Mar 28, 2022 4:00:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T16:00:48.626Z: Cleaning up.
Mar 28, 2022 4:00:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T16:00:48.685Z: Stopping **** pool...
Mar 28, 2022 4:00:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T16:00:48.778Z: Stopping **** pool...
Mar 28, 2022 4:03:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T16:03:13.831Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 28, 2022 4:03:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-28T16:03:13.871Z: Worker pool stopped.
Mar 28, 2022 4:03:20 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-28_05_46_00-6615587993976244565 finished with status CANCELLED.
Load test results for test (ID): 7e45512f-0866-4d5f-8ca0-ddbe945786e2 and timestamp: 2022-03-28T12:45:53.168000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11547.548
dataflow_v2_java11_total_bytes_count             2.08803436E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220328124335
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:8ab0592c319a80611ebb313a3f4f7415a08345bc116f299ddd07bc4048b2dafa
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220328124335]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:8ab0592c319a80611ebb313a3f4f7415a08345bc116f299ddd07bc4048b2dafa]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220328124335] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:8ab0592c319a80611ebb313a3f4f7415a08345bc116f299ddd07bc4048b2dafa])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:8ab0592c319a80611ebb313a3f4f7415a08345bc116f299ddd07bc4048b2dafa
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:8ab0592c319a80611ebb313a3f4f7415a08345bc116f299ddd07bc4048b2dafa
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:8ab0592c319a80611ebb313a3f4f7415a08345bc116f299ddd07bc4048b2dafa].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 7s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/h34fnxbp6epg2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #281

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/281/display/redirect>

Changes:


------------------------------------------
[...truncated 1016.92 KB...]
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Mar 27, 2022 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-27T16:00:43.325Z: Cancel request is committed for workflow job: 2022-03-27_05_45_31-8015577374502407750.
Mar 27, 2022 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-27T16:00:43.381Z: Cleaning up.
Mar 27, 2022 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-27T16:00:43.475Z: Stopping **** pool...
Mar 27, 2022 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-27T16:00:43.518Z: Stopping **** pool...
Mar 27, 2022 4:03:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-27T16:03:00.407Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 27, 2022 4:03:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-27T16:03:00.445Z: Worker pool stopped.
Mar 27, 2022 4:03:10 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-27_05_45_31-8015577374502407750 finished with status CANCELLED.
Load test results for test (ID): ec5a730c-c07d-48cf-aa0d-3fca7a6cfa6f and timestamp: 2022-03-27T12:45:25.081000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11556.458
dataflow_v2_java11_total_bytes_count             3.00635796E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220327124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:326a1fd5c05f65aaf1110ae9051ee734a6622abf2c4047e25fa29697e44b24fc
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220327124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:326a1fd5c05f65aaf1110ae9051ee734a6622abf2c4047e25fa29697e44b24fc]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220327124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:326a1fd5c05f65aaf1110ae9051ee734a6622abf2c4047e25fa29697e44b24fc])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:326a1fd5c05f65aaf1110ae9051ee734a6622abf2c4047e25fa29697e44b24fc
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:326a1fd5c05f65aaf1110ae9051ee734a6622abf2c4047e25fa29697e44b24fc
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:326a1fd5c05f65aaf1110ae9051ee734a6622abf2c4047e25fa29697e44b24fc].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 57s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/4wbvgsnzwh276

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #280

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/280/display/redirect?page=changes>

Changes:

[ryanthompson591] iterable_input_value_types will now be an iterable, I don't anticipate

[marco.robles] [BEAM-8218] PulsarIO Connector

[benjamin.gonzalez] [BEAM-12572] Change examples jobs to run as cron jobs

[benjamin.gonzalez] [BEAM-12572] SpotlessApply

[Robert Bradshaw] [BEAM-14171] More explicit asserts in CoGBKResult.

[Robert Bradshaw] Add some comments.

[noreply] [BEAM-14160] Parse filesToStage in Java expansion service (#17167)

[chamikaramj] Mapped JOB_STATE_RESOURCE_CLEANING_UP to RESOURCE_CLEANING_UP in Python

[noreply] Explicitly import estimator from tensorflow (#17168)


------------------------------------------
[...truncated 86.29 KB...]
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Mar 26, 2022 4:03:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-26T16:03:32.740Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 26, 2022 4:03:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-26T16:03:32.813Z: Worker pool stopped.
Mar 26, 2022 4:03:40 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-26_05_46_17-14546943808767005751 finished with status CANCELLED.
Load test results for test (ID): b69a95b7-708f-499e-a2a9-35bd81906ccd and timestamp: 2022-03-26T12:46:10.787000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11544.819
dataflow_v2_java11_total_bytes_count             4.18055427E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220326124342
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:31fd5590b2cfb26f685f3312e415bd8b3458105d8efc19c1eceb4c662d92fc3f
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220326124342]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:31fd5590b2cfb26f685f3312e415bd8b3458105d8efc19c1eceb4c662d92fc3f]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220326124342] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:31fd5590b2cfb26f685f3312e415bd8b3458105d8efc19c1eceb4c662d92fc3f])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:31fd5590b2cfb26f685f3312e415bd8b3458105d8efc19c1eceb4c662d92fc3f
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:31fd5590b2cfb26f685f3312e415bd8b3458105d8efc19c1eceb4c662d92fc3f
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:31fd5590b2cfb26f685f3312e415bd8b3458105d8efc19c1eceb4c662d92fc3f].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 22s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/yq2s5gewokqyu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #279

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/279/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-14139] Drop support for Flink 1.11.

[Kyle Weaver] [BEAM-14139] Remove obsolete reference to Flink 1.11.

[Kyle Weaver] [BEAM-14139] Update list of supported Flink versions.

[Kyle Weaver] [BEAM-14139] Update CHANGES.md

[noreply] [BEAM-14157] Don't call requestObserver.onNext on a closed windmill

[noreply] Minor: Make IOTypeHints a real NamedTuple (#17174)

[noreply] [BEAM-14172] Update tox.ini for pydocs (#17176)

[noreply] [BEAM-14065] Upgrade vendored bytebuddy to version 1.12.8 (#17028)


------------------------------------------
[...truncated 555.08 KB...]
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend
Mar 25, 2022 12:59:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-03-25T12:59:05.957Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Mar 25, 2022 2:24:58 PM org.apache.beam.sdk.metrics.MetricsEnvironment getCurrentContainer
WARNING: Reporting metrics are not supported in the current execution environment.
Mar 25, 2022 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-25T16:00:43.785Z: Cancel request is committed for workflow job: 2022-03-25_05_45_32-12945564368059182213.
Mar 25, 2022 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-25T16:00:43.847Z: Cleaning up.
Mar 25, 2022 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-25T16:00:43.908Z: Stopping **** pool...
Mar 25, 2022 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-25T16:00:43.961Z: Stopping **** pool...
Mar 25, 2022 4:03:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-25T16:03:07.635Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 25, 2022 4:03:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-25T16:03:07.672Z: Worker pool stopped.
Mar 25, 2022 4:03:16 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-25_05_45_32-12945564368059182213 finished with status CANCELLED.
Load test results for test (ID): f58c3274-baa9-4541-84a7-4c2c9d3b366f and timestamp: 2022-03-25T12:45:27.720000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11569.111
dataflow_v2_java11_total_bytes_count             2.01588099E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220325124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5bf487e4a648978d68afd411a442b1cd14a98a0adfc30dd2db3eb9eb14d6f107
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220325124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5bf487e4a648978d68afd411a442b1cd14a98a0adfc30dd2db3eb9eb14d6f107]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220325124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5bf487e4a648978d68afd411a442b1cd14a98a0adfc30dd2db3eb9eb14d6f107])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5bf487e4a648978d68afd411a442b1cd14a98a0adfc30dd2db3eb9eb14d6f107
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5bf487e4a648978d68afd411a442b1cd14a98a0adfc30dd2db3eb9eb14d6f107
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5bf487e4a648978d68afd411a442b1cd14a98a0adfc30dd2db3eb9eb14d6f107].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 2s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/k2e4ai4cjiao6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #278

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/278/display/redirect?page=changes>

Changes:

[bulat.safiullin] [BEAM-13976] [Website] update homepage

[bulat.safiullin] [BEAM-13976] [Website] update homepage, add logo

[bulat.safiullin] [BEAM-13976] [Website] update text

[bulat.safiullin] [BEAM-13976] [Website] Update Community landing page

[bulat.safiullin] [BEAM-13979] [Website] Update Community/Contact us page

[bulat.safiullin] [BEAM-13979] [Website] update title

[bulat.safiullin] [BEAM-13979] [Website] delete space

[bulat.safiullin] [BEAM-13979] [Website] add Beam Playground

[bulat.safiullin] [BEAM-13976] [Website] delete Beam Playground

[bulat.safiullin] [BEAM-13976] [Website] change navbar css links rules, delete links from

[bulat.safiullin] [BEAM-13977] [Website] delete available-contact-channels on mobile

[bulat.safiullin] [BEAM-13976] [Website] change padding size between the sections

[bulat.safiullin] [BEAM-13976] [Website] change title to capital letters

[bulat.safiullin] [BEAM-13976] [Website] change title

[bulat.safiullin] [BEAM-14040] [Website] create new page, add link

[bulat.safiullin] [BEAM-13977] [Website] change title

[bulat.safiullin] [BEAM-13979] [Website] change text

[bulat.safiullin] [BEAM-13976] [Website] change text

[bulat.safiullin] [BEAM-13977] [Website] change text, add capital letters

[bulat.safiullin] [BEAM-13976] [Website] add playground sass, change text-align

[bulat.safiullin] [BEAM-14040] [Website] add io connectors table

[bulat.safiullin] [BEAM-13976] [Website] add playground section, add empty line

[bulat.safiullin] [BEAM-14040] [Website] add overflow to css, add table content

[bulat.safiullin] [BEAM-14040] [Website] change ✘ for ✔, add license, add br

[bulat.safiullin] [BEAM-14040] [Website] add empty line

[bulat.safiullin] [BEAM-14040] [Website] change td

[bulat.safiullin] [BEAM-14041] [Website] update built io transforms

[bulat.safiullin] [BEAM-14041] [Website] move connectors from Miscellaneous to Database

[bulat.safiullin] [BEAM-14040] [Website] change links color

[danielamartinmtz] Updated metrics' CronJob API to use the latest batch version.

[bulat.safiullin] [BEAM-14041] [Website] change IO from go to java

[bulat.safiullin] [BEAM-14040] [Website] change links, change specific version to current

[danielamartinmtz] Updated cluster to test in metrics-upgrade-clone in BeamMetrics_Publish

[aydar.zaynutdinov] [BEAM-13976][Website]

[aydar.zaynutdinov] [BEAM-14040][Website]

[aydar.zaynutdinov] [BEAM-14041][Website]

[danielamartinmtz] Updated StateFulSet k8s obejct in cassandra-svc-statefulset.yaml file in

[danielamartinmtz] Updated documentation including cluster specs.

[noreply] Beam 13058 k8s apis upgrade - elasticsearch (#18)

[danielamartinmtz] Removed code used for testing.

[danielamartinmtz] Removed code used for testing in job_PostCommit_BeamMetrics_Publish

[noreply] Beam 13058 k8s apis upgrade - Adding Basic Auth details in documentation

[Pablo Estrada] [BEAM-14151] Excluding Spanner CDC tests from Dataflow V1 suite

[danielamartinmtz] Added comments in initContainers and remove unused code in elasticsearch

[noreply] [BEAM-14134] Optimize memory allocations for various core coders

[noreply] [BEAM-14129] Restructure PubsubLiteIO Read side to produce smaller

[noreply] [BEAM-12697] Add primitive field generation from IR to SBE extension

[noreply] [BEAM-13889] Add test cases to jsonx package (#17124)

[noreply] Remove unreachable code in container.go (#17166)

[noreply] Add ability to handle streaming input to AvroSchemaIOProvider (#17126)

[noreply] [BEAM-12898] Flink Load Tests failure- UncheckedExecutionException -

[Daniel Oliveira] Moving to 2.39.0-SNAPSHOT on master branch.


------------------------------------------
[...truncated 663.20 KB...]
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
[...repeated identical state-read error traces truncated...]
Mar 24, 2022 4:03:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-24T16:03:31.718Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 24, 2022 4:03:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-24T16:03:31.756Z: Worker pool stopped.
Mar 24, 2022 4:03:38 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-24_05_51_30-13038354825805581537 finished with status CANCELLED.
Load test results for test (ID): 71be8c78-d03a-4883-8c73-278dff13b643 and timestamp: 2022-03-24T12:51:23.302000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11232.055
dataflow_v2_java11_total_bytes_count             3.66438955E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220324124845
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:23dd33b71af4dc30d27e6504abb9ad7a95ca776d900ce9bac2e911c239c6010d
Deleted: sha256:7fcbc4e45af465646e3d93f6bef3b4e4f271bf5f6ae422f7c770394e72f9cdd0
Deleted: sha256:d99645a53fdda7375528f01532d5c66902f4780b29fc197b51fd5bea3df3ffa7
Deleted: sha256:24e465da6bae7ebf1d4e81f62836fad53bbdc720e520467b48673ff2519795ba
Deleted: sha256:823be08ca78c26bbdbaba0663d02b5e680f5c1c5c2641342cb98f54c9380068d
Deleted: sha256:69692f30058de66929426407c75dd29a1c99d153a1509ad35f6068e3a95a3170
Deleted: sha256:7de27d46983da0c6b0e0093609ebd47cb2fc84239b6884fd4902a5f7c6954dcc
Deleted: sha256:7d0b4804cb46eb8656246dcb96762e029d43885999d4accbc7054847bfc4a6ae
Deleted: sha256:ec0162284c5cda0c57eba7bd5e5a97c9c376a1eecceddd64419056632fa42943
Deleted: sha256:5182e937d77c33faed25b6fd481e47456983007fcd414ec7bc8d6252b891bb73
Deleted: sha256:44c740c2a201a39c864d9b75c1be7e675f40e6e5f42ada0ab3c66e06668822ed
Deleted: sha256:a149dbfd9987d9d7681e69705d980f67811d59043deab3232158a396111aa687
Deleted: sha256:93a0b570c739a463f66df494dfadc403b1a9a91a9e36abfee3d0606a04fac738
Deleted: sha256:c15e734e285225ba302d82805c38eb7c4d967e7306b45736f051d1a67c246f28
Deleted: sha256:8f86e84e7ad8f893f341a26e2510de3264075b42ab28aae4a5316895aa8f2a3c
Deleted: sha256:ad8337f37b9bd7bb9ec94a2a6b24da5f664429bc66d98f222168f852e2ff10f1
Deleted: sha256:ae7b6fd444a49cd1ba46e90c34a90311774004d1ac7b49d87c82a775600b9e91
Deleted: sha256:d537a9eb20362a84b425c81864d3985e99caf0b039e71f436ca908724cb709eb
Deleted: sha256:c54008b25e48b50c1a7143c4a323fcc707b558e564148522775e0029d91ce524
Deleted: sha256:6ea9e8f3bcf07b8a358b7cae4dd1ea4e018bec505a2adbe6afb7bad4c6a36424
Deleted: sha256:c49c9b7de209f0513ce63912301c2fbc137a503241b065991c2d507af18a11c7
Deleted: sha256:ec6931c0ce7ea9657e8668370b5a891df41e370b0de4b13ae0cc77ff722a4e14
Deleted: sha256:1ff7385db5173ac5417693ff2719ca08828276f964f3396b53a2a62c81b9370c
Deleted: sha256:d68e59285c98f895b5471eef1a7363cb69800cca31a7c503de21ef3457565c69
Deleted: sha256:3737880d7e074ff3e049db7dbbcb68f42ff43bc4ed596f3091e8f35def31df84
Deleted: sha256:3bb9ef5f40f3469313c180c72dc1febbc5f0b394462ce163b45b37b0a3a6fbce
Deleted: sha256:fe4d37b3e0b9375b5e8581e453df556481a5a448d934ab5e75f8613e9b813d8a
Deleted: sha256:3e1832ff4654cb1adfade80aefd057485485bf15025db8525664d0892a22dc08
Deleted: sha256:e99ba411de8f479871ccfa553fa25b251c42822c5f9c4fd4536798f6aa564ef9
Deleted: sha256:1c46d4e326ac5b54513ca567cea147f52c0282b4a4c4f7c452d1bba3eb6971c9
Deleted: sha256:c0723547720e658450be4f6506752dcd17e6fabad3f4c35c98d69f3ba520a494
Deleted: sha256:43daa0e6291eb92050f0610ca58faed9797cd22b3c9ad3137f83aef2ec1a650e
Deleted: sha256:1c2fbaa263104285a33bb4e487bd5f94c4845f66bd02f5bb3b63d7cdae3b45e8
Deleted: sha256:ee3e7d17ae03658baacb4f73b1f12fa1d3648db05bfedbaf2d9f77d9a836df6f
Deleted: sha256:9fd2d5db6bee6578774bba9283e44a04b944bd64660be42d5a23a462b130c6da
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220324124845]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:23dd33b71af4dc30d27e6504abb9ad7a95ca776d900ce9bac2e911c239c6010d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220324124845] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:23dd33b71af4dc30d27e6504abb9ad7a95ca776d900ce9bac2e911c239c6010d])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:431e20c7f35b829b1264d9a5c7ca20a96b3ca646d1f38bc88b764947685b23af
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:431e20c7f35b829b1264d9a5c7ca20a96b3ca646d1f38bc88b764947685b23af
ERROR: (gcloud.container.images.delete) Not found: response: {'docker-distribution-api-version': 'registry/2.0', 'content-type': 'application/json', 'date': 'Thu, 24 Mar 2022 16:03:44 GMT', 'server': 'Docker Registry', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '168', '-content-encoding': 'gzip'}
Failed to compute blob liveness for manifest: 'sha256:431e20c7f35b829b1264d9a5c7ca20a96b3ca646d1f38bc88b764947685b23af': None

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED
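
The cleanup task fails here because the untagged digest was already gone and gcloud's 404 ("Not found") response is treated as fatal. A minimal sketch of a more tolerant delete step, assuming the `gcloud` CLI is on PATH — `delete_image_digest` is a hypothetical helper, not part of the Beam scripts, and the image references below are placeholders:

```shell
#!/usr/bin/env bash
# Hypothetical helper: delete a container image digest, but treat a
# "Not found" response as success, so a digest that was already
# removed (e.g. by a concurrent cleanup) cannot fail the build.
delete_image_digest() {
  local image_ref="$1"
  local output
  if output=$(gcloud container images delete "${image_ref}" \
      --force-delete-tags --quiet 2>&1); then
    echo "Deleted: ${image_ref}"
    return 0
  fi
  if grep -q "Not found" <<<"${output}"; then
    # Digest no longer exists; nothing left to clean up.
    echo "Already gone: ${image_ref}"
    return 0
  fi
  # Any other error (permissions, network) still propagates.
  echo "${output}" >&2
  return 1
}
```

Genuine failures still exit non-zero, so only the already-deleted race is swallowed.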

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 297

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 15m 32s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/arkyy5ctm25ee

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 277 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 277 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/277/ to view the results.

Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #276

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/276/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-14124] Add display data to BQ storage reads.

[mmack] [adhoc] Move aws IT tests to testing package according to best practices

[noreply] fixes static checks and go lint issues (#17138)

[Kyle Weaver] Don't print in task configuration.

[noreply] [BEAM-14136] Clean up staticcheck and linter warnings in the Go SDK

[noreply] Merge pull request #17063 from [BEAM-12164] Fix flaky tests

[noreply] Revert "[BEAM-14112] Avoid storing a generator in _CustomBigQuerySource

[Kyle Weaver] [BEAM-4106] Remove filesToStage from Flink pipeline option list.

[noreply] [BEAM-14071] Enabling Flink on Dataproc for Interactive Beam (#17044)

[noreply] Minor: Bypass schema registry in schemas_test.py (#17108)


------------------------------------------
[...truncated 95.48 KB...]
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 1397, in https_open
    return self.do_open(http.client.HTTPSConnection, req,
  File "/usr/lib/python3.8/urllib/request.py", line 1357, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 101] Network is unreachable>
ERROR:root:Invalid url for spotbugs-annotations-4.0.6: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html. Retrying...
Traceback (most recent call last):
  File "/usr/lib/python3.8/urllib/request.py", line 1354, in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
  File "/usr/lib/python3.8/http/client.py", line 1256, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1302, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1251, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1011, in _send_output
    self.send(msg)
  File "/usr/lib/python3.8/http/client.py", line 951, in send
    self.connect()
  File "/usr/lib/python3.8/http/client.py", line 922, in connect
    self.sock = self._create_connection(
  File "/usr/lib/python3.8/socket.py", line 808, in create_connection
    raise err
  File "/usr/lib/python3.8/socket.py", line 796, in create_connection
    sock.connect(sa)
OSError: [Errno 101] Network is unreachable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py",> line 58, in pull_from_url
    url_read = urlopen(url)
  File "/usr/lib/python3.8/urllib/request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.8/urllib/request.py", line 525, in open
    response = self._open(req, data)
  File "/usr/lib/python3.8/urllib/request.py", line 542, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 1383, in http_open
    return self.do_open(http.client.HTTPConnection, req)
  File "/usr/lib/python3.8/urllib/request.py", line 1357, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 101] Network is unreachable>
ERROR:root:Invalid url for checkstyle-8.23: http://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt after 9 retries.
ERROR:root:Invalid url for spotbugs-annotations-4.0.6: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html after 9 retries.
ERROR:root:Invalid url for jFormatString-3.0.0: http://www.gnu.org/licenses/lgpl.html. Retrying...
ERROR:root:Invalid url for jFormatString-3.0.0: http://www.gnu.org/licenses/lgpl.html after 9 retries.
ERROR:root:['checkstyle-8.23', 'spotbugs-annotations-4.0.6', 'jFormatString-3.0.0']
ERROR:root:**************************************** Licenses were not able to be pulled automatically for some dependencies. Please search source code of the dependencies on the internet and add "license" and "notice" (if available) field to <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml> for each missing license. Dependency List: [checkstyle-8.23,jFormatString-3.0.0,spotbugs-annotations-4.0.6]
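The remedy the error message points at is a per-dependency entry in dep_urls_java.yaml. A hypothetical entry is sketched below; the exact field layout is inferred from the error message ("license" and "notice" fields per dependency) and the license URLs are the ones the script was retrying above, so check the real file's schema before copying:

```yaml
# Hypothetical entries for dep_urls_java.yaml; layout inferred from the error
# message above, not verified against the actual file in the Beam repo.
checkstyle:
  '8.23':
    license: "https://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt"
spotbugs-annotations:
  '4.0.6':
    license: "https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html"
jFormatString:
  '3.0.0':
    license: "https://www.gnu.org/licenses/lgpl.html"
```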
INFO:root:pull_licenses_java.py failed. It took 481.309235 seconds with 16 threads.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py",> line 321, in <module>
    raise RuntimeError('{n} error(s) occurred.'.format(n=len(error_msg)),
RuntimeError: ('1 error(s) occurred.', ['**************************************** Licenses were not able to be pulled automatically for some dependencies. Please search source code of the dependencies on the internet and add "license" and "notice" (if available) field to <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml> for each missing license. Dependency List: [checkstyle-8.23,jFormatString-3.0.0,spotbugs-annotations-4.0.6]'])
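As the log shows, pull_licenses_java.py retries each unreachable license URL ("Retrying...") up to 9 times before recording a final failure ("after 9 retries"). A minimal sketch of that retry pattern is below; the function name and parameters are illustrative, not the script's actual API, and the opener is injectable so the loop can be exercised without a network connection:

```python
import time
from urllib.error import URLError
from urllib.request import urlopen


def fetch_with_retries(url, opener=urlopen, retries=9, delay=1.0):
    """Fetch a URL, retrying on URLError/OSError up to `retries` extra times.

    `opener` defaults to urllib's urlopen but can be replaced with any callable
    returning an object with a read() method, which makes the retry logic
    testable offline.
    """
    last_err = None
    for attempt in range(retries + 1):
        try:
            return opener(url).read()
        except (URLError, OSError) as err:
            last_err = err
            if attempt < retries:
                time.sleep(delay)
    # Mirrors the log's final message once every attempt has failed.
    raise RuntimeError(f"Invalid url for {url} after {retries} retries.") from last_err
```

In the failing build the errors were `[Errno 101] Network is unreachable`, so every attempt failed and the script correctly aborted rather than shipping a container with missing license files.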

> Task :sdks:java:container:pullLicenses FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:container:pullLicenses'.
> Process 'command './license_scripts/license_script.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 8m 40s
103 actionable tasks: 66 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/avmkg5gaq7xym

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 275 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 275 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/275/ to view the results.

Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #274

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/274/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14122] Upgrade pip-licenses dependency (#17132)


------------------------------------------
[...truncated 1.03 MB...]
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Mar 20, 2022 4:03:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-20T16:03:28.474Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 20, 2022 4:03:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-20T16:03:28.520Z: Worker pool stopped.
Mar 20, 2022 4:03:36 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-20_05_45_31-13247473804466536619 finished with status CANCELLED.
Load test results for test (ID): 4081eae5-20e3-40ec-8c40-041db40a7155 and timestamp: 2022-03-20T12:45:26.539000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11571.711
dataflow_v2_java11_total_bytes_count             4.05289716E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
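JobFailure.handleFailure raised because the job's terminal state was CANCELLED rather than DONE. The helper below is an illustrative Python sketch of that check, not Beam's Java implementation; the state names are the standard Dataflow terminal states, with only DONE treated as success:

```python
# Illustrative sketch of the terminal-state check JobFailure.handleFailure
# performs; state names assume Dataflow's standard job states.
SUCCESSFUL_TERMINAL_STATES = {"DONE"}
TERMINAL_STATES = {"DONE", "FAILED", "CANCELLED", "UPDATED", "DRAINED"}


def check_job_state(state):
    """Raise unless the job reached a successful terminal state."""
    if state not in TERMINAL_STATES:
        raise ValueError(f"Job is not in a terminal state: {state}")
    if state not in SUCCESSFUL_TERMINAL_STATES:
        # Matches the failure seen in this build for state CANCELLED.
        raise RuntimeError(f"Invalid job state: {state}.")
```

Here the load test ran for the full window (runtime_sec was still reported) before the job was cancelled, so the metrics above were collected even though the build is marked failed.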

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220320124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7e7891b816923724eff10bca36c9d7437c0940ee6a8517eff86d77c7743e5783
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220320124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7e7891b816923724eff10bca36c9d7437c0940ee6a8517eff86d77c7743e5783]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220320124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7e7891b816923724eff10bca36c9d7437c0940ee6a8517eff86d77c7743e5783])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7e7891b816923724eff10bca36c9d7437c0940ee6a8517eff86d77c7743e5783
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7e7891b816923724eff10bca36c9d7437c0940ee6a8517eff86d77c7743e5783
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7e7891b816923724eff10bca36c9d7437c0940ee6a8517eff86d77c7743e5783].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:86650a6a49b3352cd5317eae0bbb63e37a6144ec88e2989a4f6ff2aa27c37755
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:86650a6a49b3352cd5317eae0bbb63e37a6144ec88e2989a4f6ff2aa27c37755
ERROR: (gcloud.container.images.delete) Not found: response: {'docker-distribution-api-version': 'registry/2.0', 'content-type': 'application/json', 'date': 'Sun, 20 Mar 2022 16:03:44 GMT', 'server': 'Docker Registry', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '168', '-content-encoding': 'gzip'}
Failed to compute blob liveness for manifest: 'sha256:86650a6a49b3352cd5317eae0bbb63e37a6144ec88e2989a4f6ff2aa27c37755': None

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 297

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 22s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/6waipw3mk2a3u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #273

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/273/display/redirect?page=changes>

Changes:

[Kiley Sok] Add Java 17 Nexmark metrics to Grafana

[yiru] .

[yiru] .

[yiru] .

[yiru] format fix

[yiru] .

[yiru] make DoFn into a separate class

[yiru] .

[yiru] fix setting

[mmack] [adhoc] Minor cleanup for aws2 tests

[mmack] [BEAM-14125] Update website IO matrix to recommend aws2 IOs

[noreply] [BEAM-14128] Eliminating quadratic behavior of

[noreply] [BEAM-13972] Add RunInference interface (#16917)

[noreply] Merge pull request #17116 from [BEAM-12164] Remove change_stream in

[yiru] fix checkstyle

[yiru] spotlessapply

[noreply] Deprecate tags.go (#17025)

[noreply] [BEAM-12753] and [BEAM-12815] Fix Flink Integration Tests (#17067)

[noreply] Merge pull request #16895 from [BEAM-13882][Playground] More tests for

[noreply] [BEAM-13925] Add weekly automation to update our reviewer config

[noreply] Merge pull request #17076 from Beam 14082 update playground for mobile

[noreply] [BEAM-13925] Assign committers in the scheduled action (#17062)

[noreply] Pin setup-gcloud to v0 instead of master (#17123)

[noreply] [BEAM-3304] documentation for PaneInfo in BPG (#17047)

[noreply] Merge pull request #17016 from [BEAM-14049][Playground] Add new API

[noreply] Merge pull request #17077 from [BEAM-14078] [Website] change link

[noreply] Merge pull request #17085 from [BEAM-14077] [Website] add beam

[noreply] Update Changes.md w/Go pipeline pre-process fix.

[noreply] [BEAM-14098] wrapper for postgres on JDBC IO GO SDK (#17088)

[noreply] Merge pull request #17023 from [BEAM-12164]: Remove child partition


------------------------------------------
[...truncated 70.79 KB...]
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Mar 19, 2022 12:53:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-03-19T12:53:07.871Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Mar 19, 2022 4:00:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-19T16:00:56.135Z: Cancel request is committed for workflow job: 2022-03-19_05_47_24-5863629577210895251.
Mar 19, 2022 4:00:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-19T16:00:56.205Z: Cleaning up.
Mar 19, 2022 4:00:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-19T16:00:56.285Z: Stopping worker pool...
Mar 19, 2022 4:00:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-19T16:00:56.331Z: Stopping worker pool...
Mar 19, 2022 4:03:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-19T16:03:26.974Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 19, 2022 4:03:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-19T16:03:27.012Z: Worker pool stopped.
Mar 19, 2022 4:03:32 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-19_05_47_24-5863629577210895251 finished with status CANCELLED.
Load test results for test (ID): 9ad8457e-0c66-4428-94b2-96540ca6eccb and timestamp: 2022-03-19T12:47:18.418000000Z:
Metric:                                  Value:
dataflow_v2_java11_runtime_sec           11478.827
dataflow_v2_java11_total_bytes_count     3.84489071E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
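The stack trace above shows the load test failing because the streaming job's terminal state was CANCELLED rather than DONE. A hypothetical, simplified sketch of that failure-handling pattern is below; the class, method, and enum names are illustrative only and are not the actual Beam SDK API:

```java
// Sketch (assumed names): a load test inspects the pipeline's terminal state
// and raises a RuntimeException for anything other than DONE. In this build,
// the job was cancelled after its configured runtime, so the terminal state
// was CANCELLED and the Gradle task failed.
public class JobFailureSketch {

    enum JobState { DONE, FAILED, CANCELLED }

    // Returns null when the terminal state is acceptable, otherwise the
    // message used to fail the build.
    static String failureMessage(JobState terminalState) {
        if (terminalState == JobState.DONE) {
            return null;
        }
        return "Invalid job state: " + terminalState + ".";
    }

    static void handleFailure(JobState terminalState) {
        String message = failureMessage(terminalState);
        if (message != null) {
            // Mirrors the uncaught exception seen in the log above.
            throw new RuntimeException(message);
        }
    }

    public static void main(String[] args) {
        handleFailure(JobState.DONE); // acceptable terminal state, no exception
        try {
            handleFailure(JobState.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

Under this sketch, any cancelled or failed job makes the wrapping Gradle task exit non-zero, which is why the build is marked FAILED even though cancellation was the expected end of a timed load test.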

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220319124442
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e86a4495494816abe21da39a5ca73fa60554abdb3f3af238548d4e70d5f8647c
Deleted: sha256:1298de5eb7d966731aad43b36c80ba13320ae3b599322fe304051992d15fe6ef
Deleted: sha256:5b063d86a9103d938e3da9bb9fb84afbc261b079064b2fda2ba79e23b3ff9195
Deleted: sha256:052eaeaeb83c75c05c2f7b8c8cfabf21a4e174051cc052104b9cdb1c92d6354f
Deleted: sha256:01ffaa4b2bd3fae1b11a47d2f500ef12570b78c8572b6563e5e5fce038461665
Deleted: sha256:3d0c68bdf589ee66fc5595024cdd2a6c8b0b60735565ff00dd1ff20021b16c87
Deleted: sha256:72c8cd5ae5c602e80fb86e4f69f83be2bfd57d1098d910e86231f3494ba70e2f
Deleted: sha256:08264aecae2257ed4a42898e2eedea7a3420b5a016f7c2318a4e8662ab6fc775
Deleted: sha256:027654594ade49d09f7b9db521dfe498d91650dfb663d547f2b80d12338f3664
Deleted: sha256:5d93eaeb8ab6b19d26e3d6e37208ccd59f34f8248c3644d49e58d645f9762f9e
Deleted: sha256:0a72533f76a5113fba3da2a3dc26fb4c1d8b1d1ebc81cfa0f0e962c4ff481f10
Deleted: sha256:e9a6161df81b362f62a131767338696c23de5f0f63c23ce03c0ff29159a64093
Deleted: sha256:0ab5d28255607cffdd639d82dfaf699699cf8fa1632c10bae413947f2225d4a4
Deleted: sha256:d3b378115e484072d01a2fbc216241608774175ee156acf7a40a4f331cc0fc10
Deleted: sha256:7688fe78b32544ece3000cf0db490fb7c1954105891dc3e9fd4ec339d5a693d3
Deleted: sha256:46c8143528278b408625c1b5500a635b93e60daeb688afb1ff21def5d1700bcc
Deleted: sha256:4083eb54b9e866cfbf2ee11fa685cb6f9b36ff1668583e5818e985c68379716f
Deleted: sha256:4fe7b6f971b08cb97e52f6b3e496a14f670288c2d708f1921d2b6af8f9077e18
Deleted: sha256:9e879f6a88e8a51bc0dbb72231602aaf7396eb4e35d3536c6f00af65f73e2fa7
Deleted: sha256:f7051bb901146e23c65acdbdd13421aedb58c2490a361663b105914037bd5019
Deleted: sha256:30bb46f8c49bd605dce7ed3db08832729fb9722fa7405e8fbfc054c34e5fe22e
Deleted: sha256:72d3081f8ad9038ed4237eb81247b742327386286ef2bd0b7530d7c811591287
Deleted: sha256:95b0bc70d8f49ec5c3861aa1ec8acb1a074add1d0c86ff6bc7bf227d6b55c543
Deleted: sha256:68e24f3813b2980a19815e3a06621661209a28b6c26b88c03d63aa4314d6eee4
Deleted: sha256:c59130b2a9546e766601d219c946adbe871b7c838c4d6c7262f6d017a236fe60
Deleted: sha256:575c0a4f07fbaf06e2edebf785a4e15ba55600f3f44f24849188170475269212
Deleted: sha256:1085342f9feefc4ef92862b7593398cb4a2d22f6f347877dde1a8d7e8c33c816
Deleted: sha256:7ba9352094444b07c4d698fbf0a3076e2f1046516506dfd235eed978514faaea
Deleted: sha256:e15012f1428eec0e33d6e33d32de3e6e05e22708b8e13703b5f8e118da0dbb5d
Deleted: sha256:4235a26604f41a0ecd9663c43994446e5b13988467ac80057da4d2481de3ec31
Deleted: sha256:12236b43581ad9330274d6ac158951c5f75c839ba22585c837470a4b818ef5a7
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220319124442]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e86a4495494816abe21da39a5ca73fa60554abdb3f3af238548d4e70d5f8647c]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220319124442] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e86a4495494816abe21da39a5ca73fa60554abdb3f3af238548d4e70d5f8647c])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e86a4495494816abe21da39a5ca73fa60554abdb3f3af238548d4e70d5f8647c
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e86a4495494816abe21da39a5ca73fa60554abdb3f3af238548d4e70d5f8647c
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e86a4495494816abe21da39a5ca73fa60554abdb3f3af238548d4e70d5f8647c].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 40s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/uvuay4e3xnyym

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #272

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/272/display/redirect?page=changes>

Changes:

[Luke Cwik] [BEAM-10212] Clean-up comments, remove rawtypes usage.

[noreply] [BEAM-11934] Add enable_file_dynamic_sharding to allow DataflowRunner

[noreply] [BEAM-12777] Create symlink for `current` directory (#17105)

[noreply] [BEAM-14020] Adding SchemaTransform, SchemaTransformProvider,

[noreply] [BEAM-13015] Modify metrics to begin and reset to a non-dirty state.

[noreply] [BEAM-14112] Avoid storing a generator in _CustomBigQuerySource (#17100)

[noreply] Populate environment capabilities in v1beta3 protos. (#17042)

[Kyle Weaver] [BEAM-12976] Test a whole pipeline using projection pushdown in BQ IO.

[Kyle Weaver] [BEAM-12976] Enable projection pushdown for Java pipelines on Dataflow,

[noreply] [BEAM-14038] Auto-startup for Python expansion service. (#17035)

[Kyle Weaver] [BEAM-14123] Fix typo in hdfsIntegrationTest task name.

[noreply] [BEAM-13893] improved coverage of jobopts package (#17003)

[noreply] Merge pull request #16977 from [BEAM-12164]  Added integration test for


------------------------------------------
[...truncated 254.00 KB...]
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
==>
    dist_proc/dax/workflow/****/streaming/fnapi_streaming_operators.cc:439
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Mar 18, 2022 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-18T16:00:43.099Z: Cancel request is committed for workflow job: 2022-03-18_05_45_44-17152202566758054426.
Mar 18, 2022 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-18T16:00:43.157Z: Cleaning up.
Mar 18, 2022 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-18T16:00:43.238Z: Stopping worker pool...
Mar 18, 2022 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-18T16:00:43.288Z: Stopping worker pool...
Mar 18, 2022 4:03:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-18T16:03:00.899Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 18, 2022 4:03:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-18T16:03:00.947Z: Worker pool stopped.
Mar 18, 2022 4:03:07 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-18_05_45_44-17152202566758054426 finished with status CANCELLED.
Load test results for test (ID): 0120235c-9262-494b-98fd-b2b82f161c1a and timestamp: 2022-03-18T12:45:38.983000000Z:
Metric:                                  Value:
dataflow_v2_java11_runtime_sec           11524.235
dataflow_v2_java11_total_bytes_count     3.21339149E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220318124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:cb1634ea83fac6b1ff77ef23a4baf6967f74532b4836d91e1757e5e47f5ba729
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220318124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:cb1634ea83fac6b1ff77ef23a4baf6967f74532b4836d91e1757e5e47f5ba729]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220318124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:cb1634ea83fac6b1ff77ef23a4baf6967f74532b4836d91e1757e5e47f5ba729])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:cb1634ea83fac6b1ff77ef23a4baf6967f74532b4836d91e1757e5e47f5ba729
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:cb1634ea83fac6b1ff77ef23a4baf6967f74532b4836d91e1757e5e47f5ba729
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:cb1634ea83fac6b1ff77ef23a4baf6967f74532b4836d91e1757e5e47f5ba729].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 55s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/zh5lodcqr3ioo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #271

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/271/display/redirect?page=changes>

Changes:

[noreply] Mapped JOB_STATE_RESOURCE_CLEANING_UP to State.RUNNING.

[ryanthompson591] fixed typo in typehints

[zyichi] Remove unused prebuild_sdk_container_base_iamge option from validate

[hengfeng] feat: add more custom metrics

[noreply] [BEAM-14103][Playground][Bugfix] Fix google analytics id (#17092)

[noreply] Minor: Make ScopedReadStateSupplier final (#16992)

[noreply] [BEAM-14113] Improve SamzaJobServerDriver extensibility (#17099)

[noreply] [BEAM-14116] Chunk commit requests dynamically (#17004)

[noreply] Merge pull request #17079 from [BEAM-13660] Add types and queries in

[noreply] [BEAM-13888] Add unit testing to ioutilx (#17058)

[noreply] Merge pull request #16822 from [BEAM-13841][Playground] Add Application

[noreply] Minor: Make serializableCoder warning grammatically correct English

[noreply] [BEAM-14091] Fixing Interactive Beam show/collect for remote runners


------------------------------------------
[...truncated 74.06 KB...]
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Mar 17, 2022 12:52:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-03-17T12:51:59.221Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:222
Mar 17, 2022 4:00:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-17T16:00:48.575Z: Cancel request is committed for workflow job: 2022-03-17_05_45_39-16131059750204942897.
Mar 17, 2022 4:00:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-17T16:00:48.679Z: Cleaning up.
Mar 17, 2022 4:00:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-17T16:00:48.787Z: Stopping **** pool...
Mar 17, 2022 4:00:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-17T16:00:48.910Z: Stopping **** pool...
Mar 17, 2022 4:03:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-17T16:03:18.170Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 17, 2022 4:03:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-17T16:03:18.287Z: Worker pool stopped.
Mar 17, 2022 4:03:23 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-17_05_45_39-16131059750204942897 finished with status CANCELLED.
Load test results for test (ID): b45e63b1-555b-43cc-94e3-398b9ca698ff and timestamp: 2022-03-17T12:45:34.188000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11569.069
dataflow_v2_java11_total_bytes_count             2.67311375E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
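
The RuntimeException above is the load-test harness refusing to treat a CANCELLED job as a pass: the streaming job ran until its time limit, was cancelled, and JobFailure.handleFailure turned the non-successful terminal state into a build failure. A minimal sketch of that kind of check, using hypothetical names (JobState, JobFailureCheck) rather than Beam's actual API:

```java
// Hypothetical mirror of the harness behavior: any terminal state other
// than DONE is reported as an invalid job state and fails the run.
enum JobState { RUNNING, DONE, CANCELLED, FAILED }

class JobFailureCheck {
    // Returns an error message for terminal states that should fail the
    // build, or null when the job completed successfully.
    static String handleFailure(JobState terminalState) {
        if (terminalState == JobState.DONE) {
            return null;
        }
        return "Invalid job state: " + terminalState + ".";
    }
}
```

Under this sketch, a job cancelled by the scheduled timeout yields "Invalid job state: CANCELLED.", which the harness surfaces by throwing, so Gradle marks :sdks:java:testing:load-tests:run as FAILED even though metrics were collected.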

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220317124335
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5576cca73790d1bd546fa226cfe21a1f82448d7e0baf91018fa42a7ca440e43a
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220317124335]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5576cca73790d1bd546fa226cfe21a1f82448d7e0baf91018fa42a7ca440e43a]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220317124335] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5576cca73790d1bd546fa226cfe21a1f82448d7e0baf91018fa42a7ca440e43a])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5576cca73790d1bd546fa226cfe21a1f82448d7e0baf91018fa42a7ca440e43a
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5576cca73790d1bd546fa226cfe21a1f82448d7e0baf91018fa42a7ca440e43a
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5576cca73790d1bd546fa226cfe21a1f82448d7e0baf91018fa42a7ca440e43a].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 8s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/wdn5tpfrx4cr4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #270

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/270/display/redirect?page=changes>

Changes:

[Chamikara Madhusanka Jayalath] Updates x-lang release validation to use staged jars

[dhuntsperger] documented maven-to-gradle conversion for Dataflow; refactored java

[dhuntsperger] adding a list of example pipelines

[dhuntsperger] removing unnecessary `ls` command from maven project generation

[dhuntsperger] fixing filename formatting in response to feedback

[dhuntsperger] adding extra step emphasizing runner setup

[dhuntsperger] reorganized instructions to emphasize setup steps for runners

[noreply] [BEAM-13767] Move a bunch of python tasks to use gradle configuration…

[noreply] Merge pull request #17052 from [BEAM-13818] [SnowflakeIO] Add support

[noreply] Adding pydoc for StateHandler (#17091)

[noreply] BEAM-3165 Bypass split if numSplit is zero (#17084)


------------------------------------------
[...truncated 880.70 KB...]
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
Mar 16, 2022 1:48:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-03-16T13:48:56.870Z: Staged package gax-grpc-2.8.1-Oaqma5KQUeKekzhWvuQ9BCsF3Uf1NF_DCF0Um3fxw3s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/gax-grpc-2.8.1-Oaqma5KQUeKekzhWvuQ9BCsF3Uf1NF_DCF0Um3fxw3s.jar' is inaccessible.
Mar 16, 2022 1:48:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-03-16T13:48:57.407Z: Staged package google-cloud-spanner-6.17.4-NvG_91cO2RfpEp9bgh1dl-N3SCcaKXMAoFmtVb0nBzY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-spanner-6.17.4-NvG_91cO2RfpEp9bgh1dl-N3SCcaKXMAoFmtVb0nBzY.jar' is inaccessible.
Mar 16, 2022 1:49:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-03-16T13:48:59.837Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Mar 16, 2022 1:52:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-03-16T13:52:00.131Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Mar 16, 2022 1:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-03-16T13:54:56.869Z: Staged package gax-grpc-2.8.1-Oaqma5KQUeKekzhWvuQ9BCsF3Uf1NF_DCF0Um3fxw3s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/gax-grpc-2.8.1-Oaqma5KQUeKekzhWvuQ9BCsF3Uf1NF_DCF0Um3fxw3s.jar' is inaccessible.
Mar 16, 2022 1:54:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-03-16T13:54:57.458Z: Staged package google-cloud-spanner-6.17.4-NvG_91cO2RfpEp9bgh1dl-N3SCcaKXMAoFmtVb0nBzY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-spanner-6.17.4-NvG_91cO2RfpEp9bgh1dl-N3SCcaKXMAoFmtVb0nBzY.jar' is inaccessible.
Mar 16, 2022 1:55:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-03-16T13:55:00.081Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Mar 16, 2022 1:58:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-03-16T13:58:00.061Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Mar 16, 2022 2:00:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-03-16T14:00:57.304Z: Staged package gax-grpc-2.8.1-Oaqma5KQUeKekzhWvuQ9BCsF3Uf1NF_DCF0Um3fxw3s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/gax-grpc-2.8.1-Oaqma5KQUeKekzhWvuQ9BCsF3Uf1NF_DCF0Um3fxw3s.jar' is inaccessible.
Mar 16, 2022 2:00:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-03-16T14:00:57.997Z: Staged package google-cloud-spanner-6.17.4-NvG_91cO2RfpEp9bgh1dl-N3SCcaKXMAoFmtVb0nBzY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-spanner-6.17.4-NvG_91cO2RfpEp9bgh1dl-N3SCcaKXMAoFmtVb0nBzY.jar' is inaccessible.
Mar 16, 2022 2:01:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-03-16T14:01:00.901Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Mar 16, 2022 2:04:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-03-16T14:04:03.250Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Mar 16, 2022 4:00:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-16T16:00:47.971Z: Cancel request is committed for workflow job: 2022-03-16_05_45_42-679718333454020510.
Mar 16, 2022 4:00:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-16T16:00:48.012Z: Cleaning up.
Mar 16, 2022 4:00:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-16T16:00:48.104Z: Stopping **** pool...
Mar 16, 2022 4:00:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-16T16:00:48.215Z: Stopping **** pool...
Mar 16, 2022 4:03:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-16T16:03:10.302Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 16, 2022 4:03:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-16T16:03:10.356Z: Worker pool stopped.
Mar 16, 2022 4:05:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-16T16:05:38.111Z: Cancel request is committed for workflow job: 2022-03-16_05_45_42-679718333454020510.
Mar 16, 2022 4:05:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-16T16:05:41.411Z: Cancel request is committed for workflow job: 2022-03-16_05_45_42-679718333454020510.
Mar 16, 2022 4:05:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-16T16:05:41.911Z: Cancel request is committed for workflow job: 2022-03-16_05_45_42-679718333454020510.
Mar 16, 2022 4:05:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-16T16:05:46.533Z: Cancel request is committed for workflow job: 2022-03-16_05_45_42-679718333454020510.
Mar 16, 2022 4:05:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-16T16:05:46.853Z: Cancel request is committed for workflow job: 2022-03-16_05_45_42-679718333454020510.
Mar 16, 2022 4:05:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-16T16:05:49.784Z: Cancel request is committed for workflow job: 2022-03-16_05_45_42-679718333454020510.
Mar 16, 2022 4:05:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-16T16:05:51.978Z: Cancel request is committed for workflow job: 2022-03-16_05_45_42-679718333454020510.
Mar 16, 2022 4:05:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-16T16:05:53.388Z: Cancel request is committed for workflow job: 2022-03-16_05_45_42-679718333454020510.
Mar 16, 2022 4:05:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-16T16:05:53.658Z: Cancel request is committed for workflow job: 2022-03-16_05_45_42-679718333454020510.
Mar 16, 2022 4:05:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-16T16:05:53.757Z: Cancel request is committed for workflow job: 2022-03-16_05_45_42-679718333454020510.
Mar 16, 2022 4:06:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-16T16:05:57.560Z: Cancel request is committed for workflow job: 2022-03-16_05_45_42-679718333454020510.
Mar 16, 2022 4:06:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-16T16:05:58.155Z: Cancel request is committed for workflow job: 2022-03-16_05_45_42-679718333454020510.
Mar 16, 2022 4:06:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-16_05_45_42-679718333454020510 finished with status CANCELLED.
Load test results for test (ID): f77d0ab5-13b0-4008-989b-98ebb8a1fc92 and timestamp: 2022-03-16T12:45:33.805000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11558.355
dataflow_v2_java11_total_bytes_count             2.72429876E10
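The byte-count metric above is reported in scientific notation; for quick reading it can be converted to GiB with a one-liner (the value below is copied from this run):

```shell
# Convert the reported dataflow_v2_java11_total_bytes_count to GiB.
# awk parses the scientific-notation string directly when it is used numerically.
bytes="2.72429876E10"
gib=$(awk -v b="$bytes" 'BEGIN { printf "%.2f", b / (1024 ^ 3) }')
echo "${gib} GiB"   # roughly 25 GiB for this run
```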
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
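The RuntimeException above is the load-test harness's terminal-state check: a job that finishes in any state other than DONE (here CANCELLED) fails the run. A hand-written shell analogue of that logic, not the Java source:

```shell
# Sketch of the terminal-state check behind "Invalid job state: CANCELLED."
# Any final job state other than DONE is treated as a failed load test.
check_job_state() {
  local state="$1"
  if [ "$state" != "DONE" ]; then
    echo "Invalid job state: ${state}." >&2
    return 1
  fi
  return 0
}
```

In this build the Dataflow job ended as CANCELLED, so the check fired and the Gradle task exited non-zero.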

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220316124335
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1d854f2e7879dd39a415b4b1fa926d7ac01321cbab8f391f24621cd7782b6e5e
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220316124335]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1d854f2e7879dd39a415b4b1fa926d7ac01321cbab8f391f24621cd7782b6e5e]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220316124335] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1d854f2e7879dd39a415b4b1fa926d7ac01321cbab8f391f24621cd7782b6e5e])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1d854f2e7879dd39a415b4b1fa926d7ac01321cbab8f391f24621cd7782b6e5e
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1d854f2e7879dd39a415b4b1fa926d7ac01321cbab8f391f24621cd7782b6e5e
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1d854f2e7879dd39a415b4b1fa926d7ac01321cbab8f391f24621cd7782b6e5e].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 22m 47s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ghi3mgsxhtxua

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 269 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.

Check console output at https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/269/ to view the results.

Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #268

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/268/display/redirect>

Changes:


------------------------------------------
[...truncated 216.68 KB...]
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/worker/streaming/merge_windows_fn.cc:230
[...identical message repeated 25 times...]
Mar 14, 2022 4:00:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-14T16:00:53.012Z: Cancel request is committed for workflow job: 2022-03-14_05_45_56-5460173703898451496.
Mar 14, 2022 4:00:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-14T16:00:53.257Z: Cleaning up.
Mar 14, 2022 4:00:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-14T16:00:53.417Z: Stopping worker pool...
Mar 14, 2022 4:00:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-14T16:00:53.474Z: Stopping worker pool...
Mar 14, 2022 4:03:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-14T16:03:18.632Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 14, 2022 4:03:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-14T16:03:18.691Z: Worker pool stopped.
Mar 14, 2022 4:03:25 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-14_05_45_56-5460173703898451496 finished with status CANCELLED.
Load test results for test (ID): 5b8bdbff-a6e4-425e-a55a-2c73a21d4866 and timestamp: 2022-03-14T12:45:50.392000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                  11566.43
dataflow_v2_java11_total_bytes_count             3.60536689E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220314124336
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5d12616a5668b321ef59a1a540aa2a6c2a2cb2d2c886e26214d82a90642a943d
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220314124336]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5d12616a5668b321ef59a1a540aa2a6c2a2cb2d2c886e26214d82a90642a943d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220314124336] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5d12616a5668b321ef59a1a540aa2a6c2a2cb2d2c886e26214d82a90642a943d])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5d12616a5668b321ef59a1a540aa2a6c2a2cb2d2c886e26214d82a90642a943d
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5d12616a5668b321ef59a1a540aa2a6c2a2cb2d2c886e26214d82a90642a943d
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5d12616a5668b321ef59a1a540aa2a6c2a2cb2d2c886e26214d82a90642a943d].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4494b5e74b9b3a018ef73ee9d3e1a5b7681e9f9d818a7a07978a20750a5f3529
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4494b5e74b9b3a018ef73ee9d3e1a5b7681e9f9d818a7a07978a20750a5f3529
ERROR: (gcloud.container.images.delete) Not found: response: {'docker-distribution-api-version': 'registry/2.0', 'content-type': 'application/json', 'date': 'Mon, 14 Mar 2022 16:03:34 GMT', 'server': 'Docker Registry', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '168', '-content-encoding': 'gzip'}
Failed to compute blob liveness for manifest: 'sha256:4494b5e74b9b3a018ef73ee9d3e1a5b7681e9f9d818a7a07978a20750a5f3529': None
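The 404 above occurs when cleanup races with another job that already deleted the digest. A hedged hardening sketch (`delete_image` is a hypothetical helper, not part of the actual cleanup_untagged_gcr_images.sh) that treats "Not found" as success:

```shell
# Hypothetical wrapper: ignore a failed delete of an image digest, since a
# concurrent build may have removed it first.
# $1 = digest reference, remaining args = delete command (e.g. gcloud ...).
delete_image() {
  local ref="$1"
  shift
  if "$@" "$ref" 2>/dev/null; then
    return 0
  fi
  echo "skipping ${ref}: already deleted" >&2
  return 0   # do not fail the Gradle task for an image that is already gone
}
```

Invoked as `delete_image "$digest" gcloud container images delete --quiet`, a missing digest would no longer fail the whole build.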

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 297

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 12s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/7rvtophbuhehu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #267

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/267/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14072] [BEAM-13993] [BEAM-10039] Import beam plugins before


------------------------------------------
[...truncated 456.24 KB...]
    dist_proc/dax/workflow/worker/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/worker/streaming/merge_windows_fn.cc:230
[...similar messages repeated 26 times; some additionally pass through dist_proc/dax/workflow/worker/streaming/windmill_cache_access.cc:40; final message truncated...]
Mar 13, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-13T16:00:42.315Z: Cancel request is committed for workflow job: 2022-03-13_05_45_32-8871529651542912021.
Mar 13, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-13T16:00:42.378Z: Cleaning up.
Mar 13, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-13T16:00:42.482Z: Stopping worker pool...
Mar 13, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-13T16:00:42.557Z: Stopping worker pool...
Mar 13, 2022 4:03:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-13T16:03:06.530Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 13, 2022 4:03:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-13T16:03:06.585Z: Worker pool stopped.
Mar 13, 2022 4:03:13 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-13_05_45_32-8871529651542912021 finished with status CANCELLED.
Load test results for test (ID): 8e24510b-abdf-477f-91b7-a23ff6710235 and timestamp: 2022-03-13T12:45:27.499000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11572.546
dataflow_v2_java11_total_bytes_count             3.08969986E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
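
The stack trace above shows the load-test harness rejecting the job's CANCELLED terminal state: the timed-out streaming job is cancelled, and the harness then treats any terminal state other than a successful completion as a test failure. A minimal sketch of that kind of check is below; the class, enum, and method names are assumptions for illustration, not the actual `org.apache.beam.sdk.loadtests.JobFailure` source.

```java
// Hypothetical sketch of a terminal-state check like the one that produced
// "Invalid job state: CANCELLED." above. Names are illustrative only.
public class JobFailureSketch {
  enum JobState { DONE, FAILED, CANCELLED }

  static void handleFailure(JobState state) {
    // Any terminal state other than DONE (e.g. a timeout-driven cancellation)
    // is surfaced as a RuntimeException, which fails the Gradle run task.
    if (state != JobState.DONE) {
      throw new RuntimeException("Invalid job state: " + state + ".");
    }
  }

  public static void main(String[] args) {
    handleFailure(JobState.CANCELLED); // throws: Invalid job state: CANCELLED.
  }
}
```

Under this reading, the CANCELLED status is expected when the load test hits its time budget; the build failure is the harness's deliberate signal, not a crash in the pipeline itself.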

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220313124336
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5247b876acf8c8f2f9525f7770d2cb5f59b0fb030802e5bb5c9f6a5b5d397a7a
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220313124336]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5247b876acf8c8f2f9525f7770d2cb5f59b0fb030802e5bb5c9f6a5b5d397a7a]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220313124336] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5247b876acf8c8f2f9525f7770d2cb5f59b0fb030802e5bb5c9f6a5b5d397a7a])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5247b876acf8c8f2f9525f7770d2cb5f59b0fb030802e5bb5c9f6a5b5d397a7a
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5247b876acf8c8f2f9525f7770d2cb5f59b0fb030802e5bb5c9f6a5b5d397a7a
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5247b876acf8c8f2f9525f7770d2cb5f59b0fb030802e5bb5c9f6a5b5d397a7a].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 57s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/erzttcxzvzb6g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #266

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/266/display/redirect?page=changes>

Changes:

[Ismaël Mejía] [BEAM-13981] Remove Spark Runner specific code for event logging

[vitaly.terentyev] [BEAM-2766] Support null key/values in HadoopFormatIO

[vitaly.terentyev] [BEAM-2766] Fix checkstyle


------------------------------------------
[...truncated 1.62 MB...]
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
[...identical "generic::internal: The work item requesting state read is no longer valid" messages repeated; truncated...]
Mar 12, 2022 4:03:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-12T16:03:26.969Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 12, 2022 4:03:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-12T16:03:27.039Z: Worker pool stopped.
Mar 12, 2022 4:03:45 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-12_04_45_38-3653063538555197996 finished with status CANCELLED.
Load test results for test (ID): 430f0a85-9a19-416c-8c46-c35f58262c91 and timestamp: 2022-03-12T12:45:31.816000000Z:
Metric:                                   Value:
dataflow_v2_java11_runtime_sec            11575.219
dataflow_v2_java11_total_bytes_count      3.25449609E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220312124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5227d288c5495705e185f7204d7277e29d5378ccdb41c6d8780d24c1f5552058
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220312124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5227d288c5495705e185f7204d7277e29d5378ccdb41c6d8780d24c1f5552058]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220312124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5227d288c5495705e185f7204d7277e29d5378ccdb41c6d8780d24c1f5552058])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5227d288c5495705e185f7204d7277e29d5378ccdb41c6d8780d24c1f5552058
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5227d288c5495705e185f7204d7277e29d5378ccdb41c6d8780d24c1f5552058
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5227d288c5495705e185f7204d7277e29d5378ccdb41c6d8780d24c1f5552058].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 31s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/htaiq7oowr536

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #265

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/265/display/redirect?page=changes>

Changes:

[ihr] [BEAM-13923] Fix the answers placeholders locations in the Java katas

[jakub.kukul] [BEAM-14039] Propagate ignore_unknown_columns parameter.

[stranniknm] [BEAM-14079] playground - improve accessibility

[noreply] [BEAM-13925] Find and address prs that havent been reviewed in a week

[noreply] Fix import path

[noreply] [BEAM-13925] Fix one more import path

[noreply] Add a StatefulDoFn test that sets event time timer within allowed

[noreply] Merge pull request #17056 from [BEAM-14076] [SnowflakeIO] Add support


------------------------------------------
[...truncated 557.93 KB...]
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
[...identical "generic::internal: The work item requesting state read is no longer valid" messages repeated; truncated...]
Mar 11, 2022 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-11T16:00:36.647Z: Cancel request is committed for workflow job: 2022-03-11_04_46_03-9236443180827163991.
Mar 11, 2022 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-11T16:00:36.676Z: Cleaning up.
Mar 11, 2022 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-11T16:00:36.780Z: Stopping **** pool...
Mar 11, 2022 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-11T16:00:36.824Z: Stopping **** pool...
Mar 11, 2022 4:03:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-11T16:03:06.081Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 11, 2022 4:03:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-11T16:03:06.222Z: Worker pool stopped.
Mar 11, 2022 4:03:12 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-11_04_46_03-9236443180827163991 finished with status CANCELLED.
Load test results for test (ID): 6f4cd587-af50-4e63-896b-6a479b72ac13 and timestamp: 2022-03-11T12:45:58.175000000Z:
Metric:                                   Value:
dataflow_v2_java11_runtime_sec            11490.14
dataflow_v2_java11_total_bytes_count      2.42759814E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220311124338
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c2b41c7587cde6ddc81276af420b2b20bfb8c4543dc74d0729ff4ef3fdd1b875
Deleted: sha256:f10d0226981b9d0cc0429da5ecf126a2c09396c76cb78278f14c0a145ca42a0a
Deleted: sha256:9401d34540f6f07f90bf246d9b2da67ac6883a110f8ab863f82cb5a76f10cc1e
Deleted: sha256:aa5384f9874c3e86d37185f893bef797b3f78c923a045a4b0258c2adf4e326a5
Deleted: sha256:07aa5342bdf33cdac34ae38179a95852e85b0993e0a3aea2f8be30956dbf87e6
Deleted: sha256:bf5da7ffc2cd2e662761217fb007c5f220ed12e5911c79243016cfecb7bbabf2
Deleted: sha256:85e24060179450808dc7e54e870de0d7fc130a35723748a40b19a8af1500aa41
Deleted: sha256:917ea2af63da18a0ade501209d9329a813ddfabe01e14a3b26ea47893ad8c4e9
Deleted: sha256:594b0575b8f915fb393d5c96f14c7df18c3d6393f7014fd12bbdca93a9a47a97
Deleted: sha256:9d7112d13acc9cca32caa125bed29863f565d2ae7a424d1fd711e83cbcc69b83
Deleted: sha256:e4914cd5c377848ee8714cf8c1c385d45bbc80aa68f30a90808a72e0a4e6712d
Deleted: sha256:7cba1f8192e5ef6eb28fb69d3d9ef0fce454587b7b7936909cbe5970122a0232
Deleted: sha256:779946275bf20c0aee49ef0b8d8e90950f268e36f3747f4050a908e6b139f66f
Deleted: sha256:e70a24e32b6c9e5a064885821ed506c53558ea6b2dbb25cd3e8407aa28ef0698
Deleted: sha256:f51b602e59cab92b31c23b1a9bc6df54e4ed0cd3ffe93234c5e1c49597edcefe
Deleted: sha256:5b2e43f6512e3fba737d043c7c66130cc8fbff36c38cf5b560257f01a2bf9427
Deleted: sha256:2ce536121832da3d04168f9177d92a315bbabeb584c80ca8ecf3bb45664f7c22
Deleted: sha256:bd8271b8a4de91c21c4182fadee5c4d3ea908d171c03b555e5bddfa671898b6b
Deleted: sha256:55e8b5d567d72a7b5957f5809ec2be36c6984dbadb9f431dc5d476deefbebc29
Deleted: sha256:9c777fb628f148411827f3960a026be2a1874a0993d6f50b6e47c0a6a472ee2a
Deleted: sha256:d3cdae62de540fe4ce507940bbed0c6cd4be138e1997553fd0f2fbbd2a1847bb
Deleted: sha256:b7cb683bb7b0270b7b9fe241537e2e1da5edc377ece12d2ee1b48e2a64f0d0f6
Deleted: sha256:c58ac11dafd16a1a24aa2b19444fd34319d74f821116821891be4866c860f367
Deleted: sha256:ebd2d06f169ad6acea33b845ac4348367f9883903e5925baa5e680e9fe703c18
Deleted: sha256:e56c6c2317c7d74190fc0110b215fc6f5f04cefeba59cc9848efc2dd9652561b
Deleted: sha256:ac9dc628ba11f60f6cf67c5fce7b73ba05c504316379cda360b3a1ffed6d490c
Deleted: sha256:aa40cfc476a54cd6ef3740606cc761a073945675adc7072828df650f7c1c596f
Deleted: sha256:e12d8a330a7981c730784e15c01d70e5eea6ab4b58d9dcc8e8d5149e3f7d7a16
Deleted: sha256:d275f512fc071e7016c1376050614d84181afbf8c87ea570779d6a3e0ecd85c7
Deleted: sha256:9a9f1f28c1b92420a60bc910316bb1d1fce8836cdc55946a8635e92edec0d6dc
Deleted: sha256:3d8555dc3c36afbdcbca620fdc14a5c8f4b548f8b0ed3df45cdfd17ff36efbc9
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220311124338]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c2b41c7587cde6ddc81276af420b2b20bfb8c4543dc74d0729ff4ef3fdd1b875]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220311124338] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c2b41c7587cde6ddc81276af420b2b20bfb8c4543dc74d0729ff4ef3fdd1b875])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c2b41c7587cde6ddc81276af420b2b20bfb8c4543dc74d0729ff4ef3fdd1b875
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c2b41c7587cde6ddc81276af420b2b20bfb8c4543dc74d0729ff4ef3fdd1b875
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c2b41c7587cde6ddc81276af420b2b20bfb8c4543dc74d0729ff4ef3fdd1b875].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/xrfiwq3dqbnr4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #264

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/264/display/redirect?page=changes>

Changes:

[hengfeng] [BEAM-12164]: display the metadata table's name on UI

[noreply] Revert "[BEAM-13993] [BEAM-10039] Import beam plugins before starting

[noreply] Merge pull request #17036 from [BEAM-12164] Convert all static instances

[noreply] fix variable reference (#16991)

[noreply] Merge pull request #16844 from [BEAM-12164]: allow for nanosecond

[noreply] [BEAM-13904] Increase unit testing in the reflectx package (#17024)


------------------------------------------
[...truncated 669.24 KB...]
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
Mar 10, 2022 4:02:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-10T16:02:42.341Z: Cancel request is committed for workflow job: 2022-03-10_05_00_32-10985606442155614844.
Mar 10, 2022 4:02:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-10T16:02:42.459Z: Cleaning up.
Mar 10, 2022 4:02:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-10T16:02:42.529Z: Stopping **** pool...
Mar 10, 2022 4:02:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-10T16:02:42.573Z: Stopping **** pool...
Mar 10, 2022 4:05:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-10T16:05:02.136Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 10, 2022 4:05:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-10T16:05:02.243Z: Worker pool stopped.
Mar 10, 2022 4:05:13 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-10_05_00_32-10985606442155614844 finished with status CANCELLED.
Load test results for test (ID): e01761eb-f609-4a85-ba75-2738fd3b8bb9 and timestamp: 2022-03-10T13:00:21.776000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 10776.355
dataflow_v2_java11_total_bytes_count             2.78719993E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220310124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:555b9421ea5215b87e705548b8cdc5caf69a0eb96b7d57be541cbba844b757a7
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220310124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:555b9421ea5215b87e705548b8cdc5caf69a0eb96b7d57be541cbba844b757a7]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220310124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:555b9421ea5215b87e705548b8cdc5caf69a0eb96b7d57be541cbba844b757a7])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:555b9421ea5215b87e705548b8cdc5caf69a0eb96b7d57be541cbba844b757a7
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:555b9421ea5215b87e705548b8cdc5caf69a0eb96b7d57be541cbba844b757a7
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:555b9421ea5215b87e705548b8cdc5caf69a0eb96b7d57be541cbba844b757a7].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 22m 1s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/nz3j5fbblk5gg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #263

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/263/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Update dataflow API client.

[Robert Bradshaw] Instructions for updating apitools generated files.

[noreply] Merge pull request #17027: [BEAM-11205] Upgrade GCP Libraries BOM

[noreply] [BEAM-13709] Inconsistent behavior when parsing boolean flags across

[noreply] [BEAM-10976] Bundle finalization: Harness and some exec changes (#16980)

[noreply] Merge pull request #16976 from [BEAM-14010] [Website] Add Playground

[noreply] [BEAM-12447] Upgrade cloud build client and add/cleanup options (#17032)


------------------------------------------
[...truncated 372.20 KB...]
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
Mar 09, 2022 4:03:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-09T16:03:15.518Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 09, 2022 4:03:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-09T16:03:15.562Z: Worker pool stopped.
Mar 09, 2022 4:03:22 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-09_04_45_58-9203528485196006878 finished with status CANCELLED.
Load test results for test (ID): 300e4642-aa5d-458d-9487-7a28056bfffa and timestamp: 2022-03-09T12:45:51.315000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11553.017
dataflow_v2_java11_total_bytes_count             3.29466078E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220309124339
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:60df3fbc671c0aa141df06c2cb8fcd159416f14744ed4805212b4b2ff2a70338
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220309124339]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:60df3fbc671c0aa141df06c2cb8fcd159416f14744ed4805212b4b2ff2a70338]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220309124339] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:60df3fbc671c0aa141df06c2cb8fcd159416f14744ed4805212b4b2ff2a70338])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:60df3fbc671c0aa141df06c2cb8fcd159416f14744ed4805212b4b2ff2a70338
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:60df3fbc671c0aa141df06c2cb8fcd159416f14744ed4805212b4b2ff2a70338
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:60df3fbc671c0aa141df06c2cb8fcd159416f14744ed4805212b4b2ff2a70338].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 10s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/bxail42u3bp2m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #262

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/262/display/redirect?page=changes>

Changes:

[dannymccormick] [BEAM-11085] Test that windows are correctly observed in DoFns

[jrmccluskey] [BEAM-14050] Update taxi.go example instructions

[noreply] Give pr bot write permissions on pr update

[noreply] Adding a logical type for Schemas using proto serialization. (#16940)

[noreply] BEAM-13765 missing PAssert methods (#16668)

[noreply] [BEAM-13909] improve coverage of Provision package (#17014)


------------------------------------------
[...truncated 781.14 KB...]
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
==>
    dist_proc/dax/workflow/****/streaming/fnapi_streaming_operators.cc:439
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
Mar 08, 2022 4:10:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-08T16:10:08.400Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 08, 2022 4:10:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-08T16:10:08.469Z: Worker pool stopped.
Mar 08, 2022 4:10:16 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-08_04_46_04-10577037847904562581 finished with status CANCELLED.
Load test results for test (ID): cb5cd0cc-8bec-443d-9f30-da6b54dc9b87 and timestamp: 2022-03-08T12:45:57.871000000Z:
Metric:                                  Value:
dataflow_v2_java11_runtime_sec           11955.008
dataflow_v2_java11_total_bytes_count     3.46559792E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
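The stack trace above shows the load test treating the job's CANCELLED terminal state as a build failure: LoadTest.run hands the terminal state to JobFailure.handleFailure, which throws. A minimal sketch of that check (hypothetical code, not the actual Beam JobFailure source; the state names and message format are assumptions read off the log above):

```java
// Hedged sketch of how a load test can turn a terminal Dataflow job state
// into a build-failing exception, as the stack trace above suggests.
// This is NOT the real org.apache.beam.sdk.loadtests.JobFailure code.
public class JobFailureSketch {

    // In this sketch, DONE is the only terminal state treated as success;
    // anything else (CANCELLED, FAILED, ...) fails the load test.
    static void handleFailure(String terminalState) {
        if (!"DONE".equals(terminalState)) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        handleFailure("DONE"); // succeeds silently
        try {
            handleFailure("CANCELLED");
        } catch (RuntimeException e) {
            // Matches the message seen in the log above.
            System.out.println(e.getMessage());
        }
    }
}
```

Under this reading, the job itself was cancelled externally (likely the Jenkins build timeout after ~3h20m), and the exception is just the test harness refusing to report a cancelled run as a pass.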

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220308124342
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ec1e18cf16f754bdcebcd7ea4ebfcf02941ed05055e55bd7c83d65e01e41604c
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220308124342]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ec1e18cf16f754bdcebcd7ea4ebfcf02941ed05055e55bd7c83d65e01e41604c]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220308124342] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ec1e18cf16f754bdcebcd7ea4ebfcf02941ed05055e55bd7c83d65e01e41604c])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ec1e18cf16f754bdcebcd7ea4ebfcf02941ed05055e55bd7c83d65e01e41604c
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ec1e18cf16f754bdcebcd7ea4ebfcf02941ed05055e55bd7c83d65e01e41604c
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ec1e18cf16f754bdcebcd7ea4ebfcf02941ed05055e55bd7c83d65e01e41604c].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 26m 59s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/w4ujzbz7okssg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #261

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/261/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13925] Add ability to get metrics on pr-bot performance (#16985)


------------------------------------------
[...truncated 1.23 MB...]
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
Mar 07, 2022 4:03:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-07T16:03:56.468Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 07, 2022 4:03:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-07T16:03:56.515Z: Worker pool stopped.
Mar 07, 2022 4:04:06 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-07_04_45_43-10570774633342370631 finished with status CANCELLED.
Load test results for test (ID): 062c5ac6-6462-44d9-a778-07bdb02a55d0 and timestamp: 2022-03-07T12:45:38.634000000Z:
Metric:                                  Value:
dataflow_v2_java11_runtime_sec           11598.933
dataflow_v2_java11_total_bytes_count     3.25037501E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220307124337
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bd0f158886e39d6e5c69692fef238b84598a19a3468013bcc26998f8e64f5f89
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220307124337]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bd0f158886e39d6e5c69692fef238b84598a19a3468013bcc26998f8e64f5f89]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220307124337] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bd0f158886e39d6e5c69692fef238b84598a19a3468013bcc26998f8e64f5f89])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bd0f158886e39d6e5c69692fef238b84598a19a3468013bcc26998f8e64f5f89
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bd0f158886e39d6e5c69692fef238b84598a19a3468013bcc26998f8e64f5f89
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bd0f158886e39d6e5c69692fef238b84598a19a3468013bcc26998f8e64f5f89].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 49s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/qmxvtleyvaf2i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 260 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 260 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/260/ to view the results.

Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #259

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/259/display/redirect?page=changes>

Changes:

[rahuliyer573] py: Import beam plugins before starting SdkHarness

[stephen.patel] BEAM-14011 fix s3 filesystem multipart copy

[Valentyn Tymofieiev] Bump numpy bound to include 1.22 and regenerate container deps.

[github-actions] [BEAM-13925] months in date constructor are 0 indexed

[noreply] Merge pull request #16842 from [BEAM-13932][Playground] Container's user

[noreply] Doc updates and blog post for 2.37.0 (#16887)

[noreply] Remove resolved issue in docs + update class path on sample (#17018)

[noreply] [BEAM-14016] Fixed flaky postcommit test (#17009)

[noreply] Remove resolved issue in notebook

[noreply] [BEAM-13947] Add split() and rsplit(), non-deferred column operations on

[noreply] BEAM-14026 - Fixes bug related to Unnesting nested rows in an array


------------------------------------------
[...truncated 772.92 KB...]
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
Mar 05, 2022 4:03:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-05T16:03:29.161Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 05, 2022 4:03:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-05T16:03:29.213Z: Worker pool stopped.
Mar 05, 2022 4:03:36 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-05_04_45_57-13308798815759132861 finished with status CANCELLED.
Load test results for test (ID): 84d8e531-5259-4aed-bd4b-66545d5e14c7 and timestamp: 2022-03-05T12:45:52.717000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11537.986
dataflow_v2_java11_total_bytes_count             3.18738023E10
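The two metrics above together imply an average pipeline throughput. A quick derivation (a hypothetical helper, not part of the Beam load-test harness) shows the arithmetic:

```java
public class Throughput {
    /** Average throughput implied by total bytes processed over the runtime. */
    public static double bytesPerSec(double totalBytes, double runtimeSec) {
        return totalBytes / runtimeSec;
    }
}
```

For this run, 3.18738023E10 bytes over 11537.986 seconds works out to roughly 2.76 MB/s sustained.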
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
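The RuntimeException above comes from the load-test harness treating any terminal job state other than DONE as a failure, so a job cancelled by the test's own timeout still fails the build even after its metrics were collected. A minimal sketch of that kind of terminal-state check (hypothetical class and enum, not Beam's actual JobFailure implementation):

```java
public class JobStateChecker {
    // Illustrative subset of terminal states a Dataflow job may end in.
    public enum State { DONE, FAILED, CANCELLED, UPDATED }

    /** Returns true when the terminal state should be reported as a failure. */
    public static boolean isFailure(State state) {
        return state != State.DONE;
    }

    /** Mirrors the behavior seen in the stack trace: throw on any non-DONE state. */
    public static void handleFailure(State state) {
        if (isFailure(state)) {
            throw new RuntimeException("Invalid job state: " + state);
        }
    }
}
```

Under this logic a CANCELLED job throws exactly as shown above, which is why a deliberately time-boxed streaming load test ends with a non-zero exit value.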

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220305124337
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7044ff27c0dcf836b665d127e6a0044c16176ff18c660f35168eb52d6a447100
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220305124337]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7044ff27c0dcf836b665d127e6a0044c16176ff18c660f35168eb52d6a447100]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220305124337] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7044ff27c0dcf836b665d127e6a0044c16176ff18c660f35168eb52d6a447100])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f4312f1222026043031e5c022ffa83a27d3c88c75dd543fa7bbe0f48ef5aad08
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f4312f1222026043031e5c022ffa83a27d3c88c75dd543fa7bbe0f48ef5aad08
ERROR: (gcloud.container.images.delete) Not found: response: {'docker-distribution-api-version': 'registry/2.0', 'content-type': 'application/json', 'date': 'Sat, 05 Mar 2022 16:03:42 GMT', 'server': 'Docker Registry', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '168', '-content-encoding': 'gzip'}
Failed to compute blob liveness for manifest: 'sha256:f4312f1222026043031e5c022ffa83a27d3c88c75dd543fa7bbe0f48ef5aad08': None

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 297

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 19s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/e6sdgswnbxmzm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #258

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/258/display/redirect?page=changes>

Changes:

[stranniknm] [BEAM-13999] playground - support vertical orientation for graph

[noreply] [adhoc] Prepare aws2 ClientConfiguration for json serialization and

[noreply] Merge pull request #16879 from [BEAM-12164] Add javadocs to

[noreply] [Cleanup] Update pre-v2 go package references (#17002)

[noreply] [BEAM-13885] Add unit tests to window package (#16971)

[noreply] Merge pull request #16891 from [BEAM-13872] [Playground] Increase test

[noreply] Merge pull request #16912 from [BEAM-13878] [Playground] Increase test

[noreply] Merge pull request #16946 from [BEAM-13873] [Playground] Increase test

[noreply] [BEAM-13951] Update mass_comment.py list of Run commands (#16889)

[noreply] [BEAM-10652] Allow Clustering without Partition in BigQuery (#16578)

[noreply] [BEAM-13857] Add K:V flags for expansion service jars and addresses to


------------------------------------------
[...truncated 1.11 MB...]
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
Mar 04, 2022 4:03:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-04T16:03:47.907Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 04, 2022 4:03:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-04T16:03:47.945Z: Worker pool stopped.
Mar 04, 2022 4:03:56 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-04_04_45_49-15483542717043833007 finished with status CANCELLED.
Load test results for test (ID): 7427f78b-52af-48e6-98cf-6ba3094ac1b6 and timestamp: 2022-03-04T12:45:43.646000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11549.654
dataflow_v2_java11_total_bytes_count             3.22753043E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220304124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:158baf01a5f8f17310615b14715d3efc87fef6a9e97575436e0f3dec6a8bc8d0
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220304124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:158baf01a5f8f17310615b14715d3efc87fef6a9e97575436e0f3dec6a8bc8d0]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220304124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:158baf01a5f8f17310615b14715d3efc87fef6a9e97575436e0f3dec6a8bc8d0])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:158baf01a5f8f17310615b14715d3efc87fef6a9e97575436e0f3dec6a8bc8d0
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:158baf01a5f8f17310615b14715d3efc87fef6a9e97575436e0f3dec6a8bc8d0
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:158baf01a5f8f17310615b14715d3efc87fef6a9e97575436e0f3dec6a8bc8d0].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 43s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/cvj6tr56w27a4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #257

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/257/display/redirect?page=changes>

Changes:

[Alexey Romanenko] Bump org.mongodb:mongo-java-driver to 3.12.10

[noreply] [BEAM-13973] Link Dataproc Flink master URLs to the InteractiveRunner

[noreply] [BEAM-13925] Turn pr bot on for go prs (#16984)

[Pablo Estrada] Skipping flaky sad-path tests for Spanner changestreams

[noreply] [BEAM-13964] Bump kotlin to 1.6.x (#16882)

[noreply] Merge pull request #16906: [BEAM-13974] Handle idle Storage Api streams

[noreply] Merge pull request #16562 from [BEAM-13051][D] Enable pylint warnings

[noreply] [BEAM-13925] A couple small pr-bot bug fixes (#16996)

[noreply] [BEAM-14029] Add getter, setter for target maven repo (#16995)

[noreply] [BEAM-13903] Improve coverage of metricsx package (#16994)

[noreply] [BEAM-13892] Improve coverage of avroio package (#16990)


------------------------------------------
[...truncated 48.49 KB...]
6d5bdee4481e: Preparing
3ff7f1b89814: Preparing
5feba7d0afd4: Preparing
da814db69f74: Preparing
804bc49f369a: Preparing
d1609e012401: Preparing
e3f84a8cee1f: Preparing
48144a6f44ae: Preparing
26d5108b2cba: Preparing
89fda00479fc: Preparing
ef71ca23d831: Waiting
3ff7f1b89814: Waiting
7d86b0fa5875: Waiting
5feba7d0afd4: Waiting
804bc49f369a: Waiting
da814db69f74: Waiting
d1609e012401: Waiting
48144a6f44ae: Waiting
e3f84a8cee1f: Waiting
d69ddd45c633: Waiting
18654c3dc7ba: Waiting
6d5bdee4481e: Waiting
26d5108b2cba: Waiting
63371a337244: Pushed
3ee33d36611d: Pushed
835b72d70440: Pushed
0af0e0fbfdb1: Pushed
ef71ca23d831: Pushed
55fc7aada96a: Pushed
18654c3dc7ba: Pushed
7bc364698301: Pushed
3ff7f1b89814: Pushed
d69ddd45c633: Pushed
7d86b0fa5875: Pushed
da814db69f74: Layer already exists
804bc49f369a: Layer already exists
d1609e012401: Layer already exists
e3f84a8cee1f: Layer already exists
48144a6f44ae: Layer already exists
26d5108b2cba: Layer already exists
89fda00479fc: Layer already exists
5feba7d0afd4: Pushed
6d5bdee4481e: Pushed
20220303124459: digest: sha256:d37cdf8623963e5234c0de2018c9b92542a7ece319702a285cbaa30459359b96 size: 4520

> Task :sdks:java:testing:load-tests:run
Mar 03, 2022 12:49:12 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Mar 03, 2022 12:49:15 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 206 files. Enable logging at DEBUG level to see which files will be staged.
Mar 03, 2022 12:49:18 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Mar 03, 2022 12:49:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Mar 03, 2022 12:49:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 206 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Mar 03, 2022 12:49:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 206 files cached, 0 files newly uploaded in 4 seconds
Mar 03, 2022 12:49:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Mar 03, 2022 12:49:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <114762 bytes, hash 1db0f3546f1bbaa8d1cdd2b82f904391b3a3ebe8e31e671ea204805e6f814f0b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-HbDzVG8buqjRzdK4L5BDkbOj6-jjHmceogSAXm-BTws.pb
Mar 03, 2022 12:49:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Mar 03, 2022 12:49:32 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5cf3157b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@625dfff3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26350ea2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e9469b8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a08efdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57272109, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59696551, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@648d0e6d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79e66b2f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@17273273, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f69e2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@984169e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43f1bb92, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d6bbd35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c5d6175, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7544ac86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b27b497, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b1534d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c74aa0d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c841199]
Mar 03, 2022 12:49:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Mar 03, 2022 12:49:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Mar 03, 2022 12:49:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Mar 03, 2022 12:49:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Mar 03, 2022 12:49:32 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@514de325, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@30c1da48, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a65cd8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f1ef9d6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@17461db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3fd9e827, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e682398, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@670b3ca, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24a86066, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54402c04, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5b3bb1f7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58d6b7b9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f1a4795, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a6f6c7e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c5ddccd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1dbd580, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c101cc1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d0d91a1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb48179, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@201c3cda]
Mar 03, 2022 12:49:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Mar 03, 2022 12:49:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Mar 03, 2022 12:49:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Mar 03, 2022 12:49:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Mar 03, 2022 12:49:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Mar 03, 2022 12:49:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Mar 03, 2022 12:49:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Mar 03, 2022 12:49:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Mar 03, 2022 12:49:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Mar 03, 2022 12:49:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Mar 03, 2022 12:49:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Mar 03, 2022 12:49:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.38.0-SNAPSHOT
Mar 03, 2022 12:49:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-03_04_49_33-17973105198643950507?project=apache-beam-testing
Mar 03, 2022 12:49:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-03-03_04_49_33-17973105198643950507
Mar 03, 2022 12:49:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-03-03_04_49_33-17973105198643950507
Mar 03, 2022 12:49:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-03-03T12:49:42.331Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-03-lwbl. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Mar 03, 2022 12:49:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:48.012Z: Worker configuration: e2-standard-2 in us-central1-b.
Mar 03, 2022 12:49:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:48.875Z: Expanding SplittableParDo operations into optimizable parts.
Mar 03, 2022 12:49:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:48.903Z: Expanding CollectionToSingleton operations into optimizable parts.
Mar 03, 2022 12:49:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:48.971Z: Expanding CoGroupByKey operations into optimizable parts.
Mar 03, 2022 12:49:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.042Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Mar 03, 2022 12:49:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.076Z: Expanding GroupByKey operations into streaming Read/Write steps
Mar 03, 2022 12:49:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.146Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Mar 03, 2022 12:49:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.258Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Mar 03, 2022 12:49:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.287Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Mar 03, 2022 12:49:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.322Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.349Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.386Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.417Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.454Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.479Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.511Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.537Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.607Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.644Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.667Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.695Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.729Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.760Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.792Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.817Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.841Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.862Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.887Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.918Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:49.940Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:50.108Z: Running job using Streaming Engine
Mar 03, 2022 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:49:50.365Z: Starting 5 workers in us-central1-b...
Mar 03, 2022 12:50:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:50:02.531Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Mar 03, 2022 12:50:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:50:36.488Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Mar 03, 2022 12:51:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:51:31.556Z: Workers have started successfully.
Mar 03, 2022 12:51:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T12:51:31.608Z: Workers have started successfully.
Mar 03, 2022 4:01:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T16:01:02.889Z: Cancel request is committed for workflow job: 2022-03-03_04_49_33-17973105198643950507.
Mar 03, 2022 4:01:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T16:01:02.990Z: Cleaning up.
Mar 03, 2022 4:01:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T16:01:03.079Z: Stopping worker pool...
Mar 03, 2022 4:01:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T16:01:03.158Z: Stopping worker pool...
Mar 03, 2022 4:03:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T16:03:21.663Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 03, 2022 4:03:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-03T16:03:21.731Z: Worker pool stopped.
Mar 03, 2022 4:03:28 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-03_04_49_33-17973105198643950507 finished with status CANCELLED.
Load test results for test (ID): dd23e934-54a7-4127-b2e8-f19371e21ede and timestamp: 2022-03-03T12:49:17.120000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11308.384
dataflow_v2_java11_total_bytes_count             2.62453723E10
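The metric values above are reported in raw seconds and scientific-notation bytes; for readability they can be converted to hours and GiB. A minimal sketch of that arithmetic (the values are copied from this run; the helper name is illustrative, not part of the load-test harness):

```python
def to_gib(byte_count: float) -> float:
    """Convert a raw byte count to GiB (2**30 bytes)."""
    return byte_count / 2**30

# Values reported by this load test run.
runtime_hours = 11308.384 / 3600       # roughly 3.14 hours of pipeline runtime
total_gib = to_gib(2.62453723e10)      # roughly 24.4 GiB processed by the CoGBK

print(f"runtime: {runtime_hours:.2f} h, data: {total_gib:.1f} GiB")
```

The ~3.14 h runtime is consistent with the overall "BUILD FAILED in 3h 19m 41s" reported at the end of the log, once job submission and teardown are included.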
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
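The RuntimeException above comes from the harness's terminal-state check: JobFailure.handleFailure rejects the CANCELLED terminal state, so a streaming job that is cancelled after its configured runtime still fails the build. A minimal Python stand-in for that Java logic (names are illustrative; the assumption that DONE is the only passing state mirrors the observed behavior, not Beam's exact source):

```python
OK_TERMINAL_STATES = {"DONE"}

def handle_failure(terminal_state: str) -> None:
    """Raise if the pipeline did not finish in an acceptable terminal state."""
    if terminal_state not in OK_TERMINAL_STATES:
        raise RuntimeError(f"Invalid job state: {terminal_state}.")

# handle_failure("CANCELLED") raises, matching the stack trace above;
# handle_failure("DONE") returns without error.
```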

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220303124459
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d37cdf8623963e5234c0de2018c9b92542a7ece319702a285cbaa30459359b96
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220303124459]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d37cdf8623963e5234c0de2018c9b92542a7ece319702a285cbaa30459359b96]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220303124459] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d37cdf8623963e5234c0de2018c9b92542a7ece319702a285cbaa30459359b96])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d37cdf8623963e5234c0de2018c9b92542a7ece319702a285cbaa30459359b96
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d37cdf8623963e5234c0de2018c9b92542a7ece319702a285cbaa30459359b96
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d37cdf8623963e5234c0de2018c9b92542a7ece319702a285cbaa30459359b96].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 41s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/6mb7mnivhcd4y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #256

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/256/display/redirect?page=changes>

Changes:

[egalpin] Use default context output rather than outputWithTimestamp for

[stranniknm] Palo Alto case study - fix link

[rogelio.hernandez] [BEAM-12777] Removed current docs version redirect

[noreply] Merge pull request #16850: [BEAM-11205] Upgrade Libraries BOM

[noreply] Merge pull request #16484 from [BEAM-13633] [Playground] Implement

[noreply] Add 2022 events blog post (#16975)

[noreply] Clean up Go formatter suggestions (#16973)

[noreply] [BEAM-14012] Add go fmt to Github Actions (#16978)

[noreply] [BEAM-13911] Add basic tests to Go direct runner. (#16979)

[noreply] [BEAM-13960] Add support for more types when converting from between row


------------------------------------------
[...truncated 593.69 KB...]
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
Mar 02, 2022 4:03:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-02T16:03:07.883Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 02, 2022 4:03:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-02T16:03:07.918Z: Worker pool stopped.
Mar 02, 2022 4:03:14 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-02_04_48_02-8114063887973567786 finished with status CANCELLED.
Load test results for test (ID): b5211415-29b2-43ad-a40b-3845eb907dcf and timestamp: 2022-03-02T12:47:46.150000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11417.075
dataflow_v2_java11_total_bytes_count             2.48113054E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220302124420
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f1f92d7a674fc4135c4a6f1f52f27796f6a52ac0af8c261a1fed35852a267307
Deleted: sha256:a5736b5024481f8e1bc7254972840da6a9ccee11eed2ffb027f3e8365fb3386e
Deleted: sha256:4138254ee35e6d79fc3f7fdb818edccd7588e9582f3f8fe1cea4d86585357c83
Deleted: sha256:aeb2340e1e813aec3114d76d5ff42d0e357e165cad7074418f0fda92c3de9e6d
Deleted: sha256:e142f8997cca4e6d633298f2deb152b95b2bec9394d3295bdd9815f5a56d9efc
Deleted: sha256:531851ee16f512d501e430ac3a365cb93f8ab672e3a6faa332c247569d47334f
Deleted: sha256:aec91832373af2de627a6e28864934a412d31887030162d9a9acd3430372cb46
Deleted: sha256:ace0e7d55d7a0a2c86c881abc8b18e5f803cc6be53500d39417182abda3fb077
Deleted: sha256:ff2a621d19d1b897bfed273dfdd79802176e335a3d3847e7deb4cd6156ff9062
Deleted: sha256:ec6f0fda68e5a4df70a7dfabe8da30473f10b4d7be5804cbc35f8011577ce131
Deleted: sha256:2fb32fdb89b63481e577828014aa54edf5399ed06904a62c7a559f91b637572d
Deleted: sha256:2b53875223490ae9e104d029e5e355281ec46fa12938d68a89c9a4174b5a33b1
Deleted: sha256:3266ee46048eaca3ea60a5bbb4b01c4cc5b6d2bf4641233a49d443eaf082dd1a
Deleted: sha256:c2807580bea0008baef4e6e51fdea18f5f2ebaa96ce3ed5410f6b24d27a0d112
Deleted: sha256:ba9f93fa191a731a8c08eaeb029eea09d1a77514ecb0cf85b240b7b307607106
Deleted: sha256:db2bc0d10636873d2faece85c3399220db38eb3bd9d99e3f1e9613111b209593
Deleted: sha256:98b7f7cd45d9f8dc3510fd58e7a433282e4491fd3fd6e5a150b125e9a89b2766
Deleted: sha256:2a724a490b831b3393b538695dca9c2b6a70718cd9d30349295f9943cd53f47a
Deleted: sha256:d432e01cdf3fe1037d83058c1f37157f6d3c366042c6da0e6e18b671f220a8a9
Deleted: sha256:24cb3c316e5a83fec0f813cd9695e178da4d325b23fe5cfec04d75d12cacfb8a
Deleted: sha256:08d91b1875b1fcf2e40152968d586a2880774f8294159d3c65a7e5de99ab2d4a
Deleted: sha256:d1128c2ca376bc901f91a9e948d8a333b20edd3193ad88a1ddfe3e18c04caca7
Deleted: sha256:ccf90571aa92375751daa649c5510cbbb3b5c9f3156a25209e88e017605209f8
Deleted: sha256:89781e66168fe766594bde69b0cd04ebd8580b08e45862cd9145a5b5264ab306
Deleted: sha256:d7765a1f1a6adbdaa180595b256a351001493532b2cbdb5e50d12558d58750ec
Deleted: sha256:2b023ba53e31bcb7815f1ff064d5ec5fddd2ee1c94ed94da3f869a5f4100150a
Deleted: sha256:82f687192f5f1aecd4e0ef06aa133e75cb51235e2aab3cfddf1be3cae31c9be6
Deleted: sha256:78e97d4ac63f2ddf115f7ebdbba254de17bf167a5a7cb691bca9c512d3fc5fd1
Deleted: sha256:801adf499529cbda4ab0ee2e011e0a9c09844f4a0e029db5358e93b8181862e4
Deleted: sha256:195ac04207fe714c6a69ea2c3f100b578aad963b0b2560efea16457ba1122c81
Deleted: sha256:4daaf229d0f84521cd9bd08735a1770d40ef6580fcf5c5908b25db29fc97c1fa
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220302124420]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f1f92d7a674fc4135c4a6f1f52f27796f6a52ac0af8c261a1fed35852a267307]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220302124420] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f1f92d7a674fc4135c4a6f1f52f27796f6a52ac0af8c261a1fed35852a267307])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f1f92d7a674fc4135c4a6f1f52f27796f6a52ac0af8c261a1fed35852a267307
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f1f92d7a674fc4135c4a6f1f52f27796f6a52ac0af8c261a1fed35852a267307
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f1f92d7a674fc4135c4a6f1f52f27796f6a52ac0af8c261a1fed35852a267307].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 41s
109 actionable tasks: 74 executed, 31 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/qfn2o6rctiy4q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #255

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/255/display/redirect?page=changes>

Changes:

[noreply] Build wheels for Python 3.9

[noreply] Merge pull request #16892 from [BEAM-13755] [Playground] Scroll the

[noreply] Merge pull request #16880 from [BEAM-13963][Playground] Get bucket name

[noreply] Merge pull request #16870 from [BEAM-13874][Playground] Tag multifile

[noreply] Merge pull request #16910 from [BEAM-13724] [Playground] Get the default

[noreply] [BEAM-14008] Fix incorrect guava import (#16966)

[noreply] Fix ignored exception in BatchSpannerRead. (#16960)

[noreply] [BEAM-13917] Improve coverage of databaseio package (#16956)

[noreply] [BEAM-13925] Add entry files to process new prs and pr updates for PR

[noreply] [BEAM-13899] Improve coverage of debug package (#16951)

[noreply] [BEAM-13907] Improve coverage of textio package (#16937)

[noreply] [BEAM-9150] Fix beam_PostRelease_Python_Candidate (python RC validation


------------------------------------------
[...truncated 556.67 KB...]
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
Mar 01, 2022 4:00:47 PM org.apache.beam.sdk.metrics.MetricsEnvironment getCurrentContainer
WARNING: Reporting metrics are not supported in the current execution environment.
Mar 01, 2022 4:00:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-01T16:00:50.388Z: Cancel request is committed for workflow job: 2022-03-01_04_45_40-3174255768133400827.
Mar 01, 2022 4:00:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-01T16:00:50.576Z: Cleaning up.
Mar 01, 2022 4:00:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-01T16:00:50.664Z: Stopping **** pool...
Mar 01, 2022 4:00:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-01T16:00:50.708Z: Stopping **** pool...
Mar 01, 2022 4:03:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-01T16:03:15.690Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Mar 01, 2022 4:03:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-03-01T16:03:15.792Z: Worker pool stopped.
Mar 01, 2022 4:03:28 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-03-01_04_45_40-3174255768133400827 finished with status CANCELLED.
Load test results for test (ID): 26af09ff-74c6-4f0c-b2e6-8344070c924d and timestamp: 2022-03-01T12:45:35.587000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11562.261
dataflow_v2_java11_total_bytes_count             3.74283226E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220301124335
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a3756776cfb9f7902720f53a08f44dbd46b6019309948e9d30688abc3f35fa14
Deleted: sha256:a267dfb59db9b6a0f6f8659eed4bfe48c3d042d161c39dfc4de87d579ab36c5f
Deleted: sha256:a3c24d76b23ea61b41257ed23879581e8ef6bfd63a6678330f7282cd2f4ac576
Deleted: sha256:3c75eea211d9650a1db4e9813830165eed6b96e4fa06131cb8cb73db14e1d880
Deleted: sha256:16302f0c6f6237114c23d3866f7e9a61b65fd1539e6567ca34ad5ef3023b5c78
Deleted: sha256:50f6da7332b2e71d9088d0d7cf857461a38ddeef093f5076ca6ff0226bd0dcd0
Deleted: sha256:49e797c5e407dc44dc3260ae6f0fc74958b34703260ae6015e043d03238b0e0a
Deleted: sha256:b38339c0fc71a1b3f3a641d58a0e615dcea270510f84ce984975b3aab676274b
Deleted: sha256:aecbc092ae914251253981059fbf2ccb4947289032a1976d9c549da864263085
Deleted: sha256:a7c0efcc6628315fcc6f2b079529d8d9b513a1aeba2ad810222703061bc0487e
Deleted: sha256:f52a97c1ea5d05ce002d3442007ffb87789b8a0decf58b583c7a5f654b4eedb2
Deleted: sha256:7c1616c578950dc2b94cc9f8dc7ef86705c9400a015213b0c9b7e305a98a40b6
Deleted: sha256:cd927f263ccfc829085c79de7874f88cfdbd84c6857940346319a107dfc176d2
Deleted: sha256:1981cf2c859767575ef0d92dc04c235b1f86ae4f3c7398c80c4730f47f63a7d3
Deleted: sha256:732d4ba3db3c16f52c455df8ff9705682f77dcfa4472dcff7402aad782ae93e6
Deleted: sha256:94ecd413d6ef5453e7f097108fae0879b57612417ecb7b44779f3bc6111dc329
Deleted: sha256:4367e54b34849f85db242af2c1dcba0dd4f2e42055aaf4b0192116e9f9498455
Deleted: sha256:54e2c3e3421ea06f98dee2d34f968601ba56d171fc435198eb06ee519ef2a02d
Deleted: sha256:707cdb5dffb767e23b3693fe787b3b70e9ab660cd0c5e5641ba13e5e8e924496
Deleted: sha256:723adb3d55d75cdeecb902ab9163c1dbba9f144bcb71ecea923e8f4b4382016d
Deleted: sha256:cc7ce91685d6f0f96b1c662b373b3aa9ec2a993c8c8e3b1f90f9cc0f8c9f0778
Deleted: sha256:6f374b7250be22611aef0eb2443185cd9b47b50517096974579a8d958a78fa6b
Deleted: sha256:a91ed18ad4468bda35fe324c83712ff2ff677f3aab198bbc088a8042871277b6
Deleted: sha256:fafc84ade5ace43326963faabcb2394e772a863fbd063cba8dee82f57a1bb568
Deleted: sha256:aa84aaf2836214b43e5f98c184d7a5f92a33e30066fc26e69376c1e1990488b8
Deleted: sha256:eb41e638a57ed0ac0198da6f7d986a20986b31d0528d3cf9c0ce2e371ae2318b
Deleted: sha256:3ec2e59137ba6263a1cd07cf80d515bccb8f7827c1fc65a10403f5ef1ba029c9
Deleted: sha256:899b369799c9efd5145cc86c2bc1098778aab24ace7560ef76e02e3fba232778
Deleted: sha256:f5c012a2c2341e9825bed535371caa19fa5f4599b63f2891729bf1297ea1da8f
Deleted: sha256:38fd2cdd86a67ff18476a0057682be3cc53658c5d67dc3124231d73a16bcc9c6
Deleted: sha256:26bd2ed9467e46c500e04242ee6605642fa3f5ecfdcefee46a10bc6038c2bcc3
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220301124335]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a3756776cfb9f7902720f53a08f44dbd46b6019309948e9d30688abc3f35fa14]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220301124335] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a3756776cfb9f7902720f53a08f44dbd46b6019309948e9d30688abc3f35fa14])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a3756776cfb9f7902720f53a08f44dbd46b6019309948e9d30688abc3f35fa14
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a3756776cfb9f7902720f53a08f44dbd46b6019309948e9d30688abc3f35fa14
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a3756776cfb9f7902720f53a08f44dbd46b6019309948e9d30688abc3f35fa14].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 15s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/3d7hyd3dsutbc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #254

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/254/display/redirect>

Changes:


------------------------------------------
[...truncated 1.22 MB...]
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
Feb 28, 2022 4:03:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-28T16:03:25.604Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 28, 2022 4:03:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-28T16:03:25.643Z: Worker pool stopped.
Feb 28, 2022 4:03:32 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-02-28_04_45_52-11858468572081011038 finished with status CANCELLED.
Load test results for test (ID): 86409f91-f5b1-411b-bdaf-fcc57df11fb8 and timestamp: 2022-02-28T12:45:47.058000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11569.692
dataflow_v2_java11_total_bytes_count             3.49153289E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
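The "Invalid job state: CANCELLED" failure above is the load-test harness treating any terminal state other than DONE as a failed run, which is why a job cancelled after its timeout still fails the build. A minimal sketch of that check, with illustrative enum values and method names rather than Beam's actual JobFailure API:

```java
// Hypothetical sketch of a terminal-state check like the one in
// org.apache.beam.sdk.loadtests.JobFailure.handleFailure: any terminal
// state other than DONE is turned into a RuntimeException, so a
// CANCELLED job fails the Gradle task with a non-zero exit value.
public class JobStateCheck {
    enum State { DONE, CANCELLED, FAILED }

    // Throws when the job ended in any state other than DONE.
    static void handleFailure(State terminalState) {
        if (terminalState != State.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState);
        }
    }

    public static void main(String[] args) {
        handleFailure(State.DONE); // a successful run passes silently
        try {
            handleFailure(State.CANCELLED);
            throw new AssertionError("expected a RuntimeException for CANCELLED");
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // prints: Invalid job state: CANCELLED
        }
    }
}
```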

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220228124338
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5e911027a3b004ed3985c979619d43b8667f8124268411bc6efc81a17cf55f7e
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220228124338]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5e911027a3b004ed3985c979619d43b8667f8124268411bc6efc81a17cf55f7e]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220228124338] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5e911027a3b004ed3985c979619d43b8667f8124268411bc6efc81a17cf55f7e])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5e911027a3b004ed3985c979619d43b8667f8124268411bc6efc81a17cf55f7e
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5e911027a3b004ed3985c979619d43b8667f8124268411bc6efc81a17cf55f7e
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5e911027a3b004ed3985c979619d43b8667f8124268411bc6efc81a17cf55f7e].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 15s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/dol6avvnm2mwi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #253

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/253/display/redirect>

Changes:


------------------------------------------
[...truncated 840.58 KB...]
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/worker/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/worker/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/worker/streaming/windmill_cache_access.cc:40
==>
    dist_proc/dax/workflow/worker/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/worker/streaming/merge_windows_fn.cc:230
[...identical "work item requesting state read" entries truncated...]
Feb 27, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-27T16:00:35.916Z: Cancel request is committed for workflow job: 2022-02-27_04_45_42-4525283117407589383.
Feb 27, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-27T16:00:36.018Z: Cleaning up.
Feb 27, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-27T16:00:36.078Z: Stopping worker pool...
Feb 27, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-27T16:00:36.120Z: Stopping worker pool...
Feb 27, 2022 4:02:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-27T16:02:52.923Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 27, 2022 4:02:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-27T16:02:52.965Z: Worker pool stopped.
Feb 27, 2022 4:03:00 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-02-27_04_45_42-4525283117407589383 finished with status CANCELLED.
Load test results for test (ID): caab9eaf-8109-4d80-a905-c2695a80166d and timestamp: 2022-02-27T12:45:37.243000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11547.894
dataflow_v2_java11_total_bytes_count             3.19826367E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220227124339
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a1327ed4fd4a06acfbe801b379c7308bb475be2c5ca1938062cd1e8f18d26d02
Deleted: sha256:6710e3b7da000327d1db5c4339465474ba57d60e1db2bae121548153b0d263c8
Deleted: sha256:598c17a70709cab9b3baf541026e7f258d6b57fcaccb34689c995b96ed6a4f45
Deleted: sha256:2cde009e4b7803e35daa881f254ece9bcfc6fb50df0d083e2c5d57eba86d1ad8
Deleted: sha256:e5635e9fd5c92be050cf30f5e08d43864b8cc807aafde84ea2ea6cadafd7b8ed
Deleted: sha256:9f1b1d8c1ac5d63c4fb32ce08b49f59c18dfc5e8eb9416e96efa05f9051125fa
Deleted: sha256:cd30bdcb8dac5023b12dbdc86693b71dfc2eee1d4754b7ba3c716558a5a3f34e
Deleted: sha256:5d0dc58ddce506e1c2f77ab8353f4971f81d6a36bbfe487678805d63c8b5fdf8
Deleted: sha256:037099103f4277dc75cb0e0f9e8ce7718fb5f622c1171fa1c831fd7d3744c3b6
Deleted: sha256:07b78971c9ab10009f0b8b78a8324e317c007fe36435a614c424a0de0d141912
Deleted: sha256:a13577ae76d6b58e41d3e382f68356f06e65db046175a696839ddcc4a582d464
Deleted: sha256:50e3edc9dfda1570baeeffea9e8e4d2dbafb1b0e1c2c273d891e5f39208c9236
Deleted: sha256:b29a000c53e10c0ee3e6db9e343324202e43767f14e55b705bbdd6ec1f5cda3a
Deleted: sha256:68569259a67f97c10f81be06ec0b5a2ad66867f65820382934a6212dff5bcfc3
Deleted: sha256:7241420afd5e5e87d8806b3e95e5a117ac90eebfbccae2d6ba40031708e61be7
Deleted: sha256:e8d59fb05298fd1fba8361e751d9ea04517a73a9575f77f290e573bf046edb14
Deleted: sha256:1b3cd64c7cf6e058f0ae7302df21e1d39498304e57718a974f414ee321b412a9
Deleted: sha256:1162861b6441979a773a1bccd003c024e7140ebf20528b1ac3d168f13aa8b500
Deleted: sha256:382025fb547966f51c506765ae1656f4451bb818ea4c6b2880f2dbd96f27c762
Deleted: sha256:8c902f3f210bada370e5a6103ce545076214a1bc01caf118d0792db83da43ddc
Deleted: sha256:41f7418504cae0d13fd0842bf0815a3657c9aa843d8acd4ad445c06271fefa7a
Deleted: sha256:ba549dd1924af9c87c287fa0db0d741fdf193afa4762c5117cef39f9082d530c
Deleted: sha256:c5eaf11ac52476b7c784db3c0c610cdf982b2e3add3fccc0555c69c3525209e4
Deleted: sha256:0e9ec756de07dc05bf50aa2eb243cab1410c4f5b5b34b53da61c14057cfc3ef5
Deleted: sha256:4957dcb5752414266f018a900a0871b7db39d5b088d6be63f778012637e59d53
Deleted: sha256:873b68cda1ae9f12ca528494d34d9a5e16192b7d712cf24be5d30f38ff7ef548
Deleted: sha256:474505375839d30811ed78c9795bc0a6956fdae47e2d87ca3756dac43da2db62
Deleted: sha256:fd9cfdef5651af22e1704bfeac538349e293fa31fd051d7168f88f137484011b
Deleted: sha256:0ae2fd12e5d673aab19c703917f9b3f73bbac572955af80a2de610c0b10be8e9
Deleted: sha256:d8f633ca169233192eb3d40061c946c6f2e21eafb27be7726f5ed9e951064482
Deleted: sha256:24a75fe045880a8f635c918e150e39b762a83209538e993fb9bb5b01d13ee34e
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220227124339]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a1327ed4fd4a06acfbe801b379c7308bb475be2c5ca1938062cd1e8f18d26d02]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220227124339] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a1327ed4fd4a06acfbe801b379c7308bb475be2c5ca1938062cd1e8f18d26d02])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a1327ed4fd4a06acfbe801b379c7308bb475be2c5ca1938062cd1e8f18d26d02
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a1327ed4fd4a06acfbe801b379c7308bb475be2c5ca1938062cd1e8f18d26d02
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a1327ed4fd4a06acfbe801b379c7308bb475be2c5ca1938062cd1e8f18d26d02].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 39s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/gcsbarxganbe2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #252

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/252/display/redirect?page=changes>

Changes:

[Ankur Goenka] [BEAM-13952] Sickbaying

[noreply] Memoize some objects for timer processing to reduce overhead. (#16207)

[noreply] [BEAM-13965] Use TypeDeserializer if type information is available to

[noreply] [BEAM-13912] Add more coverage for dataflow.go (#16903)

[noreply] [BEAM-12563] swaplevel general function for dataframe and series

[noreply] [BEAM-14001] Update coder.go unit tests (#16952)

[noreply] [BEAM-13910] Improve coverage of gcsx package (#16942)

[noreply] [BEAM-13015] Use a DirectExecutor for state since we are just completing


------------------------------------------
[...truncated 490.59 KB...]
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/worker/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/worker/streaming/merge_windows_fn.cc:230
[...identical "work item requesting state read" entries truncated...]
Feb 26, 2022 4:03:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-26T16:03:15.527Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 26, 2022 4:03:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-26T16:03:15.572Z: Worker pool stopped.
Feb 26, 2022 4:03:23 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-02-26_04_46_21-5289173685615956349 finished with status CANCELLED.
Load test results for test (ID): cdc5142b-2c38-4b4c-b806-6ae6127c4a67 and timestamp: 2022-02-26T12:46:14.901000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                   11525.5
dataflow_v2_java11_total_bytes_count             2.80314284E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220226124336
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ee4350a0dcb60ea9fc38df8d39d4754f20747ab6b0098ef209adccf6f4bc11e2
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220226124336]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ee4350a0dcb60ea9fc38df8d39d4754f20747ab6b0098ef209adccf6f4bc11e2]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220226124336] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ee4350a0dcb60ea9fc38df8d39d4754f20747ab6b0098ef209adccf6f4bc11e2])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ee4350a0dcb60ea9fc38df8d39d4754f20747ab6b0098ef209adccf6f4bc11e2
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ee4350a0dcb60ea9fc38df8d39d4754f20747ab6b0098ef209adccf6f4bc11e2
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ee4350a0dcb60ea9fc38df8d39d4754f20747ab6b0098ef209adccf6f4bc11e2].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 8s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/j5ktflthhdmwc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #251

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/251/display/redirect?page=changes>

Changes:

[benjamin.gonzalez] Revert PR#16253 due errors with plugin flaky-test-handler

[noreply] Fix BoundedQueueExecutor and StreamingDataflowWorker to actually limit

[noreply] [BEAM-1857] Add Neo4jIO (#15916)

[noreply] [BEAM-13767] Migrate serveral portable runner tasks to use configuration

[noreply] [BEAM-13996] Removing 'No cluster_manager is associated with the

[noreply] [BEAM-13906] Improve coverage of errors package (#16934)

[noreply] [BEAM-13886] unit tests for trigger package (#16935)

[noreply] [BEAM-4767] Remove beam- prefix from release script tags (#16899)

[noreply] [BEAM-13866] Add small unit tests to errorx, make boolean assignment

[noreply] [BEAM-13925] Add most of the supporting files for the pr management

[noreply] Merge pull request #16846 from [BEAM-12164]: Add sad path tests for


------------------------------------------
[...truncated 69.80 KB...]
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
Feb 25, 2022 12:51:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-02-25T12:51:29.818Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
Feb 25, 2022 3:55:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-02-25T15:55:23.528Z: Staged package animal-sniffer-annotations-1.20-vt1E38otwrj1wIzR1vDgznQJTsZ3gSYJaOA_wOd1Iqw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/animal-sniffer-annotations-1.20-vt1E38otwrj1wIzR1vDgznQJTsZ3gSYJaOA_wOd1Iqw.jar' is inaccessible.
Feb 25, 2022 3:55:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-02-25T15:55:24.161Z: Staged package beam-runners-core-java-2.38.0-SNAPSHOT-g1gf2aIDSblOWLSzcUcynwLjKg419MgwyCpDR4jXU8Q.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-java-2.38.0-SNAPSHOT-g1gf2aIDSblOWLSzcUcynwLjKg419MgwyCpDR4jXU8Q.jar' is inaccessible.
Feb 25, 2022 3:55:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-02-25T15:55:28.621Z: Staged package opencensus-contrib-http-util-0.28.0-ScPbKinx_bL3OSjL6pab0dQPq3zFu2JzAiur2W96eJs.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/opencensus-contrib-http-util-0.28.0-ScPbKinx_bL3OSjL6pab0dQPq3zFu2JzAiur2W96eJs.jar' is inaccessible.
Feb 25, 2022 3:55:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-02-25T15:55:29.444Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Feb 25, 2022 3:58:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-02-25T15:58:28.375Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Feb 25, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-25T16:00:42.006Z: Cancel request is committed for workflow job: 2022-02-25_04_46_01-8982574014897584564.
Feb 25, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-25T16:00:42.035Z: Cleaning up.
Feb 25, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-25T16:00:42.109Z: Stopping worker pool...
Feb 25, 2022 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-25T16:00:42.173Z: Stopping worker pool...
Feb 25, 2022 4:03:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-25T16:03:13.318Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 25, 2022 4:03:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-25T16:03:13.363Z: Worker pool stopped.
Feb 25, 2022 4:03:20 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-02-25_04_46_01-8982574014897584564 finished with status CANCELLED.
Load test results for test (ID): c0b28dd9-f76c-4e4f-8e77-4822917b1650 and timestamp: 2022-02-25T12:45:52.531000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11520.807
dataflow_v2_java11_total_bytes_count               2.835687E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220225124344
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4123850e9a343e91eb97d310664b7b3418c71d9e24c7152c47df9ee75c356982
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220225124344]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4123850e9a343e91eb97d310664b7b3418c71d9e24c7152c47df9ee75c356982]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220225124344] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4123850e9a343e91eb97d310664b7b3418c71d9e24c7152c47df9ee75c356982])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4123850e9a343e91eb97d310664b7b3418c71d9e24c7152c47df9ee75c356982
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4123850e9a343e91eb97d310664b7b3418c71d9e24c7152c47df9ee75c356982
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4123850e9a343e91eb97d310664b7b3418c71d9e24c7152c47df9ee75c356982].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 1s
109 actionable tasks: 74 executed, 31 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/n4gpaj57gqr52

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #250

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/250/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12645] Fix code-cov flakes due to monorepo. (#16925)

[noreply] [BEAM-13969] Deprecate stringx package (#16884)

[noreply] Add Go badge to ReadMe (#16897)

[noreply] [BEAM-13980] Re-add method gone missing in af2f8ee6 (#16918)

[noreply] [BEAM-13884] Improve mtime package (#16924)

[noreply] Minor: Update Go API doc links (#16932)

[noreply] [BEAM-13218] Re-enable

[noreply] Merge pull request #16857 from [BEAM-13662] [Playground] Support

[noreply] Merge pull request #16826 from [BEAM-13870] [Playground] Increase test


------------------------------------------
[...truncated 275.65 KB...]
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
Feb 24, 2022 4:03:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-24T16:03:19.529Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 24, 2022 4:03:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-24T16:03:19.568Z: Worker pool stopped.
Feb 24, 2022 4:03:27 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-02-24_04_47_03-2762069967532466111 finished with status CANCELLED.
Load test results for test (ID): 7679e447-256a-4095-a958-a80c1c535077 and timestamp: 2022-02-24T12:46:56.095000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11424.649
dataflow_v2_java11_total_bytes_count             3.76402655E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220224124440
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9076ce22567ee1b9f39a2d01bc3274e942b9333fd39b43d5b77b5b8ad6d6ad4b
Deleted: sha256:31079c1cd0e8b88d02d173a04a5a251abb6766ad41da3f06d2d763b656d1d2d6
Deleted: sha256:23c50f97613c243ea72cc3c61703d61d4ca3dcf2f78615deb23e5df4c8f6917d
Deleted: sha256:fc7cdd17b8b6bb99f2007b596426e9e3cfd7ed95ce3638445c31c55f5dae6649
Deleted: sha256:f1ce1eef07a834078ca4c6152b54d333e1bcdf93f8104459a2c310f524fae76d
Deleted: sha256:37fbb4edaa5d3ec6750e9ffff7a2df94e3401bedffa06dfa46bcc0898bd7e72d
Deleted: sha256:aac0b18ac0706b91a469d0ecc4b8144fdd66a3105d115cf8e639fe070e91acf5
Deleted: sha256:395ad758dc46cdf3dd3db40cae2afa41fba18da333b51487cef02a6665956054
Deleted: sha256:1217f134588443bf35d55bef44a1e7990fa9962aa7f399075b6bf620d6d76120
Deleted: sha256:f19b3c71f89828591d3936bd9b5c0b9189061218c59deb211d26ee4fcce23eb0
Deleted: sha256:058edbad473d3061365db63dc65a1d822b72e5a1989022a419784bc7a69fe854
Deleted: sha256:a50eb1360d0c6f45a7de1982280e81b2ce3aaaf8a1ee38a926367950101982aa
Deleted: sha256:801ff70683b23c1eed3342d481384d534d4af06c8ede220a0a89932bc23f734a
Deleted: sha256:8d98f5d748b3d72eafc085b1b8037453e222f69a0079a3834fbb67565729e4af
Deleted: sha256:6baf14b6d80b7d68ee35d00af9fb65547c4328bc23ba47f1a235d8375bd783fa
Deleted: sha256:169074b26364b02dc3cec172ee84044cd17cea1c2b652adeeff3abf54968b57f
Deleted: sha256:2a18700acde0efc6886c0f9d37cdd366667343478e1f8a810a4320b7dc982d6d
Deleted: sha256:4cead35f2c8d47bd41539eddbc1b05d7ba5db36a83f01d5fb98433d9d26c8f2d
Deleted: sha256:2b885037c2540f875376662f1f6531e3839992596b1a1af69df2b778e4dc123e
Deleted: sha256:fe2e453174c3417153c7ecbc38d2ab4a681370d09a5885c584eae87222a8efaf
Deleted: sha256:78164900173b7ab463cf8ec5b4296fce19fa7c083966f84eb2a08051c52b305c
Deleted: sha256:5d3df9b5b7ad8d14ea314e3f59d68dee54c85b660ea63817592f73df8417bcf9
Deleted: sha256:592bae978bd39dc5c90c58c7f81bb6c86b71c3ac0e3169473ae9a1de58a4cc58
Deleted: sha256:161e57ae0b490339d88e8c116b4d47b78f03576b32ed8a548f5c3360f7629c86
Deleted: sha256:4ba63a20483625cae0c567abb5f3f640c63e0664daaae6b4fcb8d977a7f5ff01
Deleted: sha256:3969e3218fdfb4a2f73923b934820b0bb403d3d4856d61eeab0d00d8a45944cd
Deleted: sha256:ebd74a1fe3d67fe34b87b25d532565e1531e44c374dce96a2ef91678c74008b6
Deleted: sha256:97b48cee63d1e78a912b774a779bc9188bf4b1def5ff2ac9f62741036f93a6a6
Deleted: sha256:4388cd9d13d809e3eccf6621034bb600c9a88087ff910dbbdf4407eaf77e6282
Deleted: sha256:4dfa9a9d0d69f7387e56db2d0d20e6fa362698bd23b3abcd1f8d642c739fb9eb
Deleted: sha256:3973bc874813844b6c2fa072cd2c36abdf668270e2cb415c2c15df3560940288
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220224124440]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9076ce22567ee1b9f39a2d01bc3274e942b9333fd39b43d5b77b5b8ad6d6ad4b]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220224124440] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9076ce22567ee1b9f39a2d01bc3274e942b9333fd39b43d5b77b5b8ad6d6ad4b])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9076ce22567ee1b9f39a2d01bc3274e942b9333fd39b43d5b77b5b8ad6d6ad4b
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9076ce22567ee1b9f39a2d01bc3274e942b9333fd39b43d5b77b5b8ad6d6ad4b
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9076ce22567ee1b9f39a2d01bc3274e942b9333fd39b43d5b77b5b8ad6d6ad4b].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 14s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/owkktz3mzfauu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #249

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/249/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-13796] projection pushdown in BQ IO

[Kyle Weaver] [BEAM-13796] Move test to ReadTest class and correct javadoc for

[Kyle Weaver] [BEAM-13796] Pushdown is not supported on TypedRead#fromQuery.

[noreply] [BEAM-13738] Reenable ignored SQS test after bumping elasticmq for fixed

[noreply] fix build status link (#16907)

[noreply] Merge pull request #16549 from [BEAM-13681][Playground] Refactoring

[noreply] Merge pull request #16732 from [BEAM-13825] [Playground] updated

[noreply] Merge pull request #16683 from [BEAM-13713][Playground] Java graph

[noreply] case study pages - improvements and fixes (#16896)

[noreply] Palo Alto case study (#16915)


------------------------------------------
[...truncated 86.10 KB...]
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
Feb 23, 2022 12:53:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-02-23T12:53:28.071Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
[...same "generic::internal" message repeated 13 more times...]
Feb 23, 2022 12:53:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-02-23T12:53:28.936Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
[...same "generic::internal" message repeated 19 more times...]
FATAL: command execution failed
java.io.IOException: Backing channel 'apache-beam-jenkins-7' is disconnected.
	at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:216)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:286)
	at com.sun.proxy.$Proxy126.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1213)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1205)
	at hudson.Launcher$ProcStarter.join(Launcher.java:522)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:806)
	at hudson.model.Build$BuildExecution.build(Build.java:198)
	at hudson.model.Build$BuildExecution.doRun(Build.java:163)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:514)
	at hudson.model.Run.execute(Run.java:1888)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:99)
	at hudson.model.Executor.run(Executor.java:432)
Caused by: java.io.IOException: Unexpected termination of the channel
	at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:75)
Caused by: java.io.EOFException
	at java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2799)
	at java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:3274)
	at java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:934)
	at java.io.ObjectInputStream.<init>(ObjectInputStream.java:396)
	at hudson.remoting.ObjectInputStreamEx.<init>(ObjectInputStreamEx.java:49)
	at hudson.remoting.Command.readFrom(Command.java:142)
	at hudson.remoting.Command.readFrom(Command.java:128)
	at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:35)
	at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:61)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-7 is offline; cannot locate jdk_1.8_latest

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #247

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/247/display/redirect>

Changes:


------------------------------------------
[...truncated 649.46 KB...]
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
[...same "generic::internal" message repeated until the log was truncated...]
FATAL: command execution failed
java.io.IOException: Backing channel 'apache-beam-jenkins-9' is disconnected.
	at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:216)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:286)
	at com.sun.proxy.$Proxy121.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1213)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1205)
	at hudson.Launcher$ProcStarter.join(Launcher.java:522)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:806)
	at hudson.model.Build$BuildExecution.build(Build.java:198)
	at hudson.model.Build$BuildExecution.doRun(Build.java:163)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:514)
	at hudson.model.Run.execute(Run.java:1888)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:99)
	at hudson.model.Executor.run(Executor.java:432)
Caused by: java.io.IOException: Pipe closed after 0 cycles
	at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:126)
	at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:105)
	at hudson.remoting.FlightRecorderInputStream.read(FlightRecorderInputStream.java:93)
	at hudson.remoting.ChunkedInputStream.readHeader(ChunkedInputStream.java:74)
	at hudson.remoting.ChunkedInputStream.readUntilBreak(ChunkedInputStream.java:104)
	at hudson.remoting.ChunkedCommandTransport.readBlock(ChunkedCommandTransport.java:39)
	at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:34)
	at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:61)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-9 is offline; cannot locate jdk_1.8_latest

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #246

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/246/display/redirect?page=changes>

Changes:

[rogelio.hernandez] [BEAM-13051] Pylint misplaced-bare-raise warning enabled

[Jeff Tapper] Update Java LTS roadmap info on website for Java 17

[Pablo Estrada] Simplify README for new users

[Kyle Weaver] [BEAM-13106] Support Flink 1.14.

[Kyle Weaver] [BEAM-13106] Reuse executor instead of shutting it down mid-test.

[Kyle Weaver] [BEAM-13106] Prevent infinite wait in Flink savepoint test.

[Ismaël Mejía] [BEAM-13202] Fix typos on tests names for VarianceFnTest

[Kenneth Knowles] Disable AfterSynchronizedProcessingTime test on Dataflow

[Ismaël Mejía] [BEAM-13202] Add Coder to CountIfFn.Accum

[Ismaël Mejía] [BEAM-13202] Reuse Count transform code since CountIf is a specific case

[Kenneth Knowles] Add test category UsesProcessingTimeTimers

[Kenneth Knowles] Label tests that need UsesProcessingTimeTimers

[Kenneth Knowles] Exclude UsesProcessingTimeTimers from SamzaRunner tests

[Kyle Weaver] [BEAM-13106] A couple additional fixes to FlinkSavepointTest.

[mmack] [adhoc] Migrate KinesisIOIT to use ITEnvironment for Localstack based IT

[rogelio.hernandez] [BEAM-13051] Added descriptions to Kinesis and PortableRunner exceptions

[noreply] [BEAM-13955] Fix pylint breakage from #16836 (#16867)

[relax] Fix TableRow conversion for the case of fields named "f"

[noreply] Bump dataflow.fnapi_container_version (#16874)

[mmack] [BEAM-13563] Introducing common AWS ClientBuilderFactory to unify

[laraschmidt] Fix final allowskew error to properly handle a large allowedSkew

[noreply] Case studies page improvements (#16702)

[noreply] [BEAM-13946] Add get_dummies(), a non-deferred column operation on

[noreply] [release-2.36.0] Fix pickler argument for 2.36 blog (#16774)

[thiagotnunes] fix: fix bug when retrieving either string or json

[noreply] [adhoc] Avoid using SerializablePipelineOptions for testing to minimize

[noreply] [BEAM-13812] Integrate DataprocClusterManager into Interactive

[noreply] [BEAM-12572] Fix failing python examples tests in Dataflow runner

[noreply] Remove build status from PR (#16902)


------------------------------------------
[...truncated 962.91 KB...]
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    ./dist_proc/dax/workflow/****/streaming/windowing_api_delegate.h:67
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
Feb 19, 2022 4:03:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-19T16:03:35.307Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 19, 2022 4:03:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-19T16:03:35.346Z: Worker pool stopped.
Feb 19, 2022 4:03:41 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-02-19_04_48_48-3087146533562638637 finished with status CANCELLED.
Load test results for test (ID): b0fe1192-e3fd-4359-a3cd-15e1bd23f94c and timestamp: 2022-02-19T12:48:42.648000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11390.007
dataflow_v2_java11_total_bytes_count             3.63530377E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220219124635
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0845c54623ae950186c40c626e985c26e6f4208adc845ed37f746edda062b967
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220219124635]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0845c54623ae950186c40c626e985c26e6f4208adc845ed37f746edda062b967]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220219124635] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0845c54623ae950186c40c626e985c26e6f4208adc845ed37f746edda062b967])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0845c54623ae950186c40c626e985c26e6f4208adc845ed37f746edda062b967
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0845c54623ae950186c40c626e985c26e6f4208adc845ed37f746edda062b967
ERROR: (gcloud.container.images.delete) Not found: response: {'docker-distribution-api-version': 'registry/2.0', 'content-type': 'application/json', 'date': 'Sat, 19 Feb 2022 16:03:48 GMT', 'server': 'Docker Registry', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '168', '-content-encoding': 'gzip'}
Failed to compute blob liveness for manifest: 'sha256:0845c54623ae950186c40c626e985c26e6f4208adc845ed37f746edda062b967': None

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 297

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 17m 28s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/uzgnwtb3xlkeg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #243

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/243/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-12712] Spark: Exclude looping timer tests.

[Kyle Weaver] [BEAM-13919] Annotate PerKeyOrderingTest with UsesStatefulParDo.

[noreply] Update 2.36.0 blog post to mention ARM64 support

[stranniknm] [BEAM-13785] playground - enable scio sdk

[noreply] Minor: Disable checker framework in nightly snapshot (#16829)

[artur.khanin] Updated example link

[noreply] [BEAM-13860] Make `DoFn.infer_output_type` return element type (#16788)

[noreply] [BEAM-13894] Unit test utilities in the ptest package (#16830)

[Kenneth Knowles] Add test for processing time continuation trigger

[noreply] [BEAM-13922] [Coverage] Make boot.go more testable and add tests

[noreply] Exclude SpannerChangeStream IT from Dataflow V1 postcommit (#16851)

[noreply] [BEAM-13930] Address StateSpec consistency issue between Runner and Fn

[mattcasters] [BEAM-13854] Document casting trick for Avro value serializer in KafkaIO

[noreply] Merge pull request #16838 from [BEAM-13931] - make sure large rows cause

[noreply] Seznam Case Study (#16825)

[noreply] [Website] Apache Hop Case Study (#16824)

[noreply] [BEAM-13694] Force hadoop-hdfs-client in hadoopVersion tests for hdfs

[noreply] [Website] Ricardo - added case study feedback (#16807)

[noreply] Merge pull request #16735 from [BEAM-13827] - fix medium file size

[noreply] Merge pull request #16753 from [BEAM-13837] [Playground] show graph on


------------------------------------------
[...truncated 1.18 MB...]
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
Feb 16, 2022 4:00:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T16:00:51.140Z: Cancel request is committed for workflow job: 2022-02-16_04_45_28-8394044303879218212.
Feb 16, 2022 4:00:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T16:00:51.179Z: Cleaning up.
Feb 16, 2022 4:00:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T16:00:51.249Z: Stopping **** pool...
Feb 16, 2022 4:00:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T16:00:51.291Z: Stopping **** pool...
Feb 16, 2022 4:03:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T16:03:10.773Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 16, 2022 4:03:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T16:03:10.807Z: Worker pool stopped.
Feb 16, 2022 4:03:16 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-02-16_04_45_28-8394044303879218212 finished with status CANCELLED.
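The autoscaling log line above gives the downscale criteria in words: low average worker CPU, sufficiently low backlog, and the pipeline keeping up with the input rate. A hedged sketch of that decision as a predicate; the thresholds and names are assumptions for illustration, not the actual Dataflow autoscaler:

```java
// Illustrative sketch of the downscale condition described in the log
// line above. Thresholds are invented; the real autoscaler's values
// and logic are internal to the Dataflow service.
public class DownscaleSketch {
    static boolean shouldReduceWorkers(double avgWorkerCpu,
                                       long backlogSeconds,
                                       boolean keepingUpWithInput) {
        // Scale down only when all three conditions from the log hold:
        // low CPU, low backlog, and throughput matching the input rate.
        return avgWorkerCpu < 0.1 && backlogSeconds < 60 && keepingUpWithInput;
    }
}
```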
Load test results for test (ID): 8256999b-a4d6-417c-976c-89b173a2d457 and timestamp: 2022-02-16T12:45:22.430000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11578.599
dataflow_v2_java11_total_bytes_count             2.90091086E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
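The stack trace above shows `JobFailure.handleFailure` raising `RuntimeException: Invalid job state: CANCELLED.` — the load test reaches its time limit, the job is cancelled, and the harness treats any terminal state other than success as a failure. A minimal sketch of that check under assumed names (the real Beam `JobFailure` implementation differs):

```java
// Sketch of the terminal-state check that produces the "Invalid job
// state" failure above. Class, enum, and method names are illustrative
// assumptions, not the actual org.apache.beam.sdk.loadtests code.
public class JobStateCheck {
    // Terminal states a Dataflow job can reach; only DONE is success.
    enum State { DONE, FAILED, CANCELLED, UPDATED }

    static boolean isFailure(State terminalState) {
        // A manual cancel (fired when the test's time limit is hit)
        // still ends in CANCELLED, which counts as a failed run.
        return terminalState != State.DONE;
    }

    static void handleFailure(State terminalState) {
        if (isFailure(terminalState)) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }
}
```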

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220216124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f13939611991a56d92a65b5c4f5ee7d772fbe526008063aefa99282e90c6ce59
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220216124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f13939611991a56d92a65b5c4f5ee7d772fbe526008063aefa99282e90c6ce59]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220216124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f13939611991a56d92a65b5c4f5ee7d772fbe526008063aefa99282e90c6ce59])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f13939611991a56d92a65b5c4f5ee7d772fbe526008063aefa99282e90c6ce59
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f13939611991a56d92a65b5c4f5ee7d772fbe526008063aefa99282e90c6ce59
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f13939611991a56d92a65b5c4f5ee7d772fbe526008063aefa99282e90c6ce59].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 2s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/4aphwbydqjxc2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #241

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/241/display/redirect?page=changes>

Changes:

[akustov] fix name project id from secreton scio deploy action

[alexander.zhuravlev] [BEAM-13775] Fixed bug with run button

[ihr] [BEAM-13836] Fix the answers placeholders locations in the Python katas

[noreply] Merge pull request #16703 from [BEAM-13804][Playground][Bugfix] Add

[noreply] Merge pull request #16611 from [BEAM-13712][Playground] Add graph for

[noreply] Merge pull request #16757 from [BEAM-13655] [Playground] Persist the


------------------------------------------
[...truncated 323.39 KB...]
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
FATAL: command execution failed
java.io.IOException: Backing channel 'apache-beam-jenkins-4' is disconnected.
	at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:216)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:286)
	at com.sun.proxy.$Proxy135.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1211)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1203)
	at hudson.Launcher$ProcStarter.join(Launcher.java:523)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:803)
	at hudson.model.Build$BuildExecution.build(Build.java:197)
	at hudson.model.Build$BuildExecution.doRun(Build.java:163)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:513)
	at hudson.model.Run.execute(Run.java:1906)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:429)
Caused by: java.io.IOException: Pipe closed after 0 cycles
	at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:118)
	at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:101)
	at hudson.remoting.FlightRecorderInputStream.read(FlightRecorderInputStream.java:93)
	at hudson.remoting.ChunkedInputStream.readHeader(ChunkedInputStream.java:74)
	at hudson.remoting.ChunkedInputStream.readUntilBreak(ChunkedInputStream.java:104)
	at hudson.remoting.ChunkedCommandTransport.readBlock(ChunkedCommandTransport.java:39)
	at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:34)
	at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:61)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-4 is offline; cannot locate jdk_1.8_latest

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #240

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/240/display/redirect?page=changes>

Changes:

[randomstep] [BEAM-9195] Bump org.testcontainers to 1.16.3


------------------------------------------
[...truncated 1.37 MB...]
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    ./dist_proc/dax/workflow/****/streaming/windowing_api_delegate.h:67
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
Feb 13, 2022 4:03:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-13T16:03:36.711Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 13, 2022 4:03:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-13T16:03:36.772Z: Worker pool stopped.
Feb 13, 2022 4:03:44 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-02-13_04_45_29-16751563456799598367 finished with status CANCELLED.
Load test results for test (ID): a0c63359-f6d7-41a1-8822-bfafbefab0e2 and timestamp: 2022-02-13T12:45:24.086000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11570.501
dataflow_v2_java11_total_bytes_count             2.86910428E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220213124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:300c8c379ee0ed0c206591cafa4153221622ad1f4b2e630d136e5d5450998b1f
Deleted: sha256:0efaf401e4dfd45c8b6610e69abe5a32cd18e429fb3b7f240556e841f563a532
Deleted: sha256:01e5dd960b2a93b18f7ada1d70ec9164064bb8b7fbda1133010259154634412d
Deleted: sha256:04e6ceb2bf217fe3be73caac295c796fc3b9d7b90271acdf57b480ad6d55f00e
Deleted: sha256:c9269396ebcb4100532180e9d75704880f3f0140a2f3287e5161024fa8e3ef55
Deleted: sha256:76c9c071c9957f427e216522d3384612bf6b8502fe0df003f65bc7a015712d86
Deleted: sha256:d5dbc69339b77fe4dc5d404f4bb6207c16a016bf2929e23a05497639b68ceaf7
Deleted: sha256:5cb8bb9f69d384cbf9fd9d666cb8fd465919565f78216ede0d8371e7dea5cdeb
Deleted: sha256:a58537abc1af3d96aae3548f63fc142da1044f812647c7d1dfd46c0fe9d531f1
Deleted: sha256:93152afc2418154443dd9533105b918b1a17d3cab810dd7e310b557e066032af
Deleted: sha256:0a9615f262d8984f29d2eef4f8550785e0857528d30cab69014ce323c8ce4287
Deleted: sha256:e8072a51e81a2d17e25442a873510626379a0d3d7d48375df02a634f8313daf4
Deleted: sha256:7eed13aa46eee45114b44de0318d658d1b85625fd86bc0fe2114faf4834bc991
Deleted: sha256:923cac7c3bdbfc5ac53bed6f46eef1a46b9d8b8c3a730e94b203154f1df900a5
Deleted: sha256:efb875a7a7f0883126c5cbf7d6412bfba82c070104e1e7299ead13d92060dc28
Deleted: sha256:bf06d700b94112c0d896d12369e5d2ebefcc7d68447d2b56854adc2ebf3348ee
Deleted: sha256:28d3c70bf2ba013f14c3fdddf037a627d136ba36964ed5dc274727a8073d9896
Deleted: sha256:c8308f79a6b9f4fb781ef874e6b7a744ff4ec799afd089e819c380fdcea0e62d
Deleted: sha256:4447d4ae46e5241ecb14a886138def92d65711a1acebc3f089680017caccee6b
Deleted: sha256:61d53d8b46135cf6b2791881129560027d158c32097554118b673625a3088cf6
Deleted: sha256:e587c0c24f8c0e1b410b1dd9eaa20464a4d24cf589b77324abcb33c492d5e3dc
Deleted: sha256:ffcd66ed87283be0fc6fe081ec6a6b08bdbc91f2adaab2bd24c29dc32dc0276d
Deleted: sha256:546320789d12dc1cbd3ede0559f1f5a74fb49ad9c65b7b4b87b43c068008aad5
Deleted: sha256:e8db8686d1ae283643c94751f3d7a519cdd2efa8fb0951ba51e0bc4cb44141c1
Deleted: sha256:d8506ed0b04fcf9dba5d6c924d08fa68f0886da320a040263bc7882156c1ffcb
Deleted: sha256:0afcb583a9b0b79e5b726e68bd838b1c9eef66a811715e9b37983beb714d09d6
Deleted: sha256:1df4f5096ed24e1c1943b2e6e40ddf6776db90787a31f2f52a9085bc7b0a7ddb
Deleted: sha256:86041402d03c5cbdb14922e72b33e51f83291b9ca319142ff1b5eb98b8ec3275
Deleted: sha256:da813778abb691c2cc907faa4b9809c66b061ebbef58af9aa3273e44f556040d
Deleted: sha256:5366fe34eec601661fd760e5c729aed13e6b9f76a29799c42de7119b932f3e06
Deleted: sha256:fbedd8cd30751c134eda9ee4a9f50b35bbc592af444530fd60613aa2485598c7
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220213124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:300c8c379ee0ed0c206591cafa4153221622ad1f4b2e630d136e5d5450998b1f]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220213124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:300c8c379ee0ed0c206591cafa4153221622ad1f4b2e630d136e5d5450998b1f])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:300c8c379ee0ed0c206591cafa4153221622ad1f4b2e630d136e5d5450998b1f
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:300c8c379ee0ed0c206591cafa4153221622ad1f4b2e630d136e5d5450998b1f
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:300c8c379ee0ed0c206591cafa4153221622ad1f4b2e630d136e5d5450998b1f].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 29s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/znyxjhfqxgdrq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #239

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/239/display/redirect?page=changes>

Changes:

[Carl Yeksigian] Cache bucket matcher regex in GcsPath

[benjamin.gonzalez] [BEAM-12672] Retry flaky tests

[benjamin.gonzalez] [BEAM-12672] Fix spotlessApply

[laraschmidt] Fixing the log line to properly handle a large allowed skew.

[n] BEAM-13159 Update embedded-redis dependency

[n] address comments

[noreply] Minor: Add 2.38.0 section to CHANGES.md (#16804)

[noreply] [BEAM-12000] Fix typo in portable Python job definition (#16812)

[noreply] [BEAM-12164]: Fixes SpannerChangeStreamIT (#16806)

[noreply] [BEAM-12572] Fix failures in python examples tests (#16781)

[noreply] [BEAM-13921] filter out debeziumIO test for spark runner (#16815)

[noreply] [BEAM-13855] Skip SpannerChangeStreamOrderedWithinKeyIT and

[noreply] [BEAM-13679] playground - move quick start category to the top (#16808)

[noreply] Update license_script.sh (#16789)

[noreply] [BEAM-13908] [Coverage] Better testing coverage for gcpopts (#16816)

[noreply] Merge pull request #16809 from [BEAM-12164] Added integration test for

[noreply] [BEAM-4032]Support staging binary distributions of dependency packages

[noreply] [BEAM-13834] Increase influxDB persistent storage. (#16817)

[noreply] Minor: Fix link to nexmark benchmarks (#16803)

[noreply] Regenerate python container base_image_requirements.txt (#16832)


------------------------------------------
[...truncated 948.86 KB...]
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
[...identical "generic::internal" log blocks truncated...]
Feb 12, 2022 4:03:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-12T16:03:27.649Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 12, 2022 4:03:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-12T16:03:27.697Z: Worker pool stopped.
Feb 12, 2022 4:03:34 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-02-12_04_46_18-10455573273840819178 finished with status CANCELLED.
Load test results for test (ID): 23971631-0aa7-432a-b4f9-07099db99af6 and timestamp: 2022-02-12T12:46:12.894000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                  11519.96
dataflow_v2_java11_total_bytes_count             3.17940905E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220212124336
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0ecd0439cbc9d1355e6b88471f3f8d594bf81564d33d33ad7a83291e0f93743e
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220212124336]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0ecd0439cbc9d1355e6b88471f3f8d594bf81564d33d33ad7a83291e0f93743e]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220212124336] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0ecd0439cbc9d1355e6b88471f3f8d594bf81564d33d33ad7a83291e0f93743e])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0ecd0439cbc9d1355e6b88471f3f8d594bf81564d33d33ad7a83291e0f93743e
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0ecd0439cbc9d1355e6b88471f3f8d594bf81564d33d33ad7a83291e0f93743e
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0ecd0439cbc9d1355e6b88471f3f8d594bf81564d33d33ad7a83291e0f93743e].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f340e1832ea427685bbba3c2dc9be10588ba8f60eff487e62c21f0f4547a1a58
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f340e1832ea427685bbba3c2dc9be10588ba8f60eff487e62c21f0f4547a1a58
ERROR: (gcloud.container.images.delete) Not found: response: {'docker-distribution-api-version': 'registry/2.0', 'content-type': 'application/json', 'date': 'Sat, 12 Feb 2022 16:03:42 GMT', 'server': 'Docker Registry', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '168', '-content-encoding': 'gzip'}
Failed to compute blob liveness for manifest: 'sha256:f340e1832ea427685bbba3c2dc9be10588ba8f60eff487e62c21f0f4547a1a58': None

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 297

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 20s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/bojdml7mqtlia

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #238

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/238/display/redirect?page=changes>

Changes:

[Carl Yeksigian] Cache bucket matcher regex in GcsPath

[benjamin.gonzalez] [BEAM-12672] Retry flaky tests

[benjamin.gonzalez] [BEAM-12672] Fix spotlessApply

[laraschmidt] Fixing the log line to properly handle a large allowed skew.

[n] BEAM-13159 Update embedded-redis dependency

[n] address comments

[noreply] Minor: Add 2.38.0 section to CHANGES.md (#16804)

[noreply] [BEAM-12000] Fix typo in portable Python job definition (#16812)

[noreply] [BEAM-12164]: Fixes SpannerChangeStreamIT (#16806)

[noreply] [BEAM-12572] Fix failures in python examples tests (#16781)

[noreply] [BEAM-13921] filter out debeziumIO test for spark runner (#16815)

[noreply] [BEAM-13855] Skip SpannerChangeStreamOrderedWithinKeyIT and

[noreply] [BEAM-13679] playground - move quick start category to the top (#16808)

[noreply] Update license_script.sh (#16789)

[noreply] [BEAM-13908] [Coverage] Better testing coverage for gcpopts (#16816)

[noreply] Merge pull request #16809 from [BEAM-12164] Added integration test for

[noreply] [BEAM-4032]Support staging binary distributions of dependency packages


------------------------------------------
[...truncated 63.70 KB...]
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:40.713Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:40.744Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:40.776Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:40.810Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:40.846Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:40.881Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:40.907Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:40.935Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:40.970Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:41.005Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:41.031Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:41.054Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:41.084Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:41.118Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:41.138Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:41.165Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:41.194Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:41.228Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:41.259Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:41.285Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:41.313Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:41.340Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Feb 11, 2022 6:16:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:41.761Z: Starting 5 ****s in us-central1-b...
Feb 11, 2022 6:16:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:16:51.294Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Feb 11, 2022 6:17:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:17:28.020Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 11, 2022 6:18:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:18:26.835Z: Workers have started successfully.
Feb 11, 2022 6:18:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-11T18:18:26.870Z: Workers have started successfully.
Feb 11, 2022 6:19:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-02-11T18:19:56.995Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
[...identical "generic::internal" log blocks truncated...]
Feb 11, 2022 6:19:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-02-11T18:19:57.060Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
Feb 11, 2022 6:19:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-02-11T18:19:57.345Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
Feb 11, 2022 6:19:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-02-11T18:19:58.170Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:214
Build timed out (after 240 minutes). Marking the build as aborted.
FATAL: command execution failed
hudson.remoting.ChannelClosedException: Channel "hudson.remoting.Channel@365b4851:apache-beam-jenkins-15": Remote call on apache-beam-jenkins-15 failed. The channel is closing down or has closed down
	at hudson.remoting.Channel.call(Channel.java:994)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:286)
	at com.sun.proxy.$Proxy128.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1211)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1203)
	at hudson.Launcher$ProcStarter.join(Launcher.java:523)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:803)
	at hudson.model.Build$BuildExecution.build(Build.java:197)
	at hudson.model.Build$BuildExecution.doRun(Build.java:163)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:513)
	at hudson.model.Run.execute(Run.java:1906)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:429)
Caused by: java.io.IOException
	at hudson.remoting.Channel.close(Channel.java:1499)
	at hudson.remoting.Channel.close(Channel.java:1455)
	at hudson.slaves.SlaveComputer.closeChannel(SlaveComputer.java:884)
	at hudson.slaves.SlaveComputer.access$100(SlaveComputer.java:110)
	at hudson.slaves.SlaveComputer$2.run(SlaveComputer.java:765)
	at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
	at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:68)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-15 is offline; cannot locate jdk_1.8_latest

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #237

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/237/display/redirect?page=changes>

Changes:

[david.prieto.rivera] Missing contribution

[noreply] [BEAM-13803] Add support for native iterable side inputs to the Go SDK

[noreply] [BEAM-11095] Better error handling for illegal emit functions (#16776)

[noreply] Merge pull request #16613 from Supporting JdbcIO driver in classpath for

[noreply] Merge pull request #15848 from [BEAM-13835] An any-type implementation

[Valentyn Tymofieiev] [BEAM-12920] Assume that bare generators types define simple generators.

[Valentyn Tymofieiev] Add a container for Python 3.9.

[Valentyn Tymofieiev] Allow job submission with Python 3.9 on Dataflow runner

[Valentyn Tymofieiev] Add Python 3.9 test suites. Keep Dataflow V1 suites unchanged for now.

[Valentyn Tymofieiev] Add py3.9 Github actions suites.

[Valentyn Tymofieiev] Py39 Doc updates.

[Valentyn Tymofieiev] [BEAM-9980] Simplify run_validates_container.sh to avoid branching.

[Valentyn Tymofieiev] Update Cython to a new version that has py39 wheels.

[Valentyn Tymofieiev] [BEAM-13845] Fix comparison with potentially incomparable default

[Valentyn Tymofieiev] [BEAM-12920] Assume that bare generators types define simple generators.

[Valentyn Tymofieiev] Mark Python 3.9 as supported version.

[noreply] [release-2.36.0][website] Fix github release notes script, header for

[noreply] Use shell to run python for setupVirtualenv (#16796)

[Daniel Oliveira] [BEAM-13830] Properly shut down Debezium expansion service in IT script.

[noreply] Merge pull request #16659 from [BEAM-13774][Playground] Add user to

[Valentyn Tymofieiev] [BEAM-13868] Remove gsutil dep from hdfs IT test.

[noreply] [BEAM-13776][Playground] (#16731)

[noreply] [BEAM-13867] Drop NaNs returned by nlargest in flight_delays example

[noreply] Announce Python 3.9 in CHANGES.md (#16802)

[Brian Hulette] Moving to 2.38.0-SNAPSHOT on master branch.

[noreply] [BEAM-11095] Better error handling for iter/reiter/multimap (#16794)


------------------------------------------
[...truncated 47.39 KB...]
57aec383ac7b: Pushed
b4164e5f025d: Pushed
535d88b6378e: Pushed
8dda956c1426: Pushed
75f72f6b56b5: Pushed
a0603f3a02d3: Pushed
f3e8e87a4b44: Pushed
a1445b7ad2a8: Pushed
d695f0110876: Pushed
efb3f834d1ce: Pushed
0aa3674558b5: Layer already exists
bf1de93fcdde: Pushed
7c072cee6a29: Layer already exists
1e5fdc3d671c: Layer already exists
bed676ceab7a: Layer already exists
613ab28cf833: Layer already exists
6398d5cccd2c: Layer already exists
0b0f2f2f5279: Layer already exists
edb67dc046f7: Pushed
dd2cb0231f4d: Pushed
20220210124333: digest: sha256:d70eb28f103e3451da3491cfddc45351966ba703159dbc8e38f8a3e6a9e05426 size: 4520

> Task :sdks:java:testing:load-tests:run
Feb 10, 2022 12:45:34 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Feb 10, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 204 files. Enable logging at DEBUG level to see which files will be staged.
Feb 10, 2022 12:45:35 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Feb 10, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Feb 10, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 204 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Feb 10, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 204 files cached, 0 files newly uploaded in 0 seconds
Feb 10, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Feb 10, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <114077 bytes, hash e6bd569948c953f638f65e779a71f8956f2f2cd1860191aaf2b885e327c45633> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-5r1WmUjJU_Y49l53mnH4lW8vLNGGAZGq8riF4yfEVjM.pb
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Feb 10, 2022 12:45:39 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3dd31157, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31c628e7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3240b2a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58434b19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d3fb0ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7dbe2ebf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4adc663e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@885e7ff, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@8bd86c8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fa9ab6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d3ef181, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a2341c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e4c0d8c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e3315d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@64db4967, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@74e6094b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a485a36, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5cf3157b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@625dfff3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26350ea2]
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Feb 10, 2022 12:45:39 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@806996, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78b612c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@257e0827, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@22752544, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@21ba2445, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@69d23296, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c3820bb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@376c7d7d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4784efd9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3fba233d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@427ae189, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@16a9eb2e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76332405, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@187e5235, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d1d8e1a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5434e40c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b48e183, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@514de325, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@30c1da48, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a65cd8]
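Both synthetic sources above report a split into exactly 20 bundles. As an illustration only (this is not Beam's SyntheticUnboundedSource code, and the method name is an assumption), a fixed-count split that divides a total byte size as evenly as possible can be sketched as:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: divide `total` bytes into `n` bundle sizes that
// differ by at most one byte and sum exactly to `total`. It mirrors the
// idea of a fixed-count source split; it is not Beam's implementation.
public class BundleSplit {
    static List<Long> split(long total, int n) {
        List<Long> sizes = new ArrayList<>();
        long base = total / n;      // minimum size of every bundle
        long remainder = total % n; // first `remainder` bundles get one extra byte
        for (int i = 0; i < n; i++) {
            sizes.add(base + (i < remainder ? 1 : 0));
        }
        return sizes;
    }

    public static void main(String[] args) {
        List<Long> sizes = BundleSplit.split(1_000_000L, 20);
        long sum = 0;
        for (long s : sizes) sum += s;
        System.out.println(sizes.size() + " bundles, total " + sum);
    }
}
```

With an exactly divisible total, every bundle in this sketch comes out the same size, which matches the 20 equal-looking bundles in the log.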
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.38.0-SNAPSHOT
Feb 10, 2022 12:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-10_04_45_39-2704250670955367490?project=apache-beam-testing
Feb 10, 2022 12:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-02-10_04_45_39-2704250670955367490
Feb 10, 2022 12:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-02-10_04_45_39-2704250670955367490
Feb 10, 2022 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-02-10T12:45:50.733Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-02-6o4n. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
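The warning above fires because job names reused as Cloud Labels must satisfy the label-value restrictions (lowercase letters, digits, underscores, and hyphens, at most 63 characters). A hypothetical pre-flight check, not part of Beam, could look like this; the choice of '0' as the replacement character is an assumption made to match the modified name shown in the warning:

```java
import java.util.regex.Pattern;

// Illustrative sketch (not Beam code): validate a job name against the
// Cloud Label value restrictions referenced in the warning, and derive a
// sanitized fallback by lowercasing and replacing disallowed characters.
public class CloudLabelCheck {
    private static final Pattern VALID = Pattern.compile("[a-z0-9_-]{0,63}");

    static boolean isValidLabelValue(String s) {
        return VALID.matcher(s).matches();
    }

    static String sanitize(String s) {
        String lowered = s.toLowerCase();
        StringBuilder out = new StringBuilder();
        for (char c : lowered.toCharArray()) {
            boolean ok = (c >= 'a' && c <= 'z') || (c >= '0' && c <= '9')
                    || c == '_' || c == '-';
            out.append(ok ? c : '0'); // assumed replacement character
        }
        return out.length() > 63 ? out.substring(0, 63) : out.toString();
    }

    public static void main(String[] args) {
        System.out.println(CloudLabelCheck.sanitize("Load.Test"));
    }
}
```

Naming the Jenkins job with only lowercase letters, digits, and hyphens would avoid the warning and keep monitoring labels readable.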
Feb 10, 2022 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:56.036Z: Worker configuration: e2-standard-2 in us-central1-b.
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:56.896Z: Expanding SplittableParDo operations into optimizable parts.
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:56.925Z: Expanding CollectionToSingleton operations into optimizable parts.
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:56.997Z: Expanding CoGroupByKey operations into optimizable parts.
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.066Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.095Z: Expanding GroupByKey operations into streaming Read/Write steps
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.149Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.251Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.279Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.306Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.339Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.373Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.409Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.431Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.477Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.510Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.542Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.579Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.614Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.648Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.669Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.694Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.726Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.758Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.803Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.827Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.862Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.887Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.907Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.935Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Feb 10, 2022 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:58.287Z: Starting 5 workers in us-central1-b...
Feb 10, 2022 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:46:02.315Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Feb 10, 2022 12:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:46:38.629Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 10, 2022 12:47:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:47:40.557Z: Workers have started successfully.
Feb 10, 2022 12:47:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:47:40.593Z: Workers have started successfully.
Feb 10, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T16:00:36.045Z: Cancel request is committed for workflow job: 2022-02-10_04_45_39-2704250670955367490.
Feb 10, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T16:00:36.110Z: Cleaning up.
Feb 10, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T16:00:36.186Z: Stopping worker pool...
Feb 10, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T16:00:36.239Z: Stopping worker pool...
Feb 10, 2022 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T16:02:55.035Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 10, 2022 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T16:02:55.077Z: Worker pool stopped.
Feb 10, 2022 4:03:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-02-10_04_45_39-2704250670955367490 finished with status CANCELLED.
Load test results for test (ID): 11464781-aed8-45cf-91b2-a0e767eaa5bb and timestamp: 2022-02-10T12:45:34.858000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11526.713
dataflow_v2_java11_total_bytes_count             3.00254704E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
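The "Invalid job state: CANCELLED" exception above is the load-test harness treating any non-successful terminal state as a failure, so a job cancelled at the end of the test window still fails the Gradle task. A minimal, self-contained sketch of that check (hypothetical names and simplified states; the real logic lives in `org.apache.beam.sdk.loadtests.JobFailure.handleFailure`):

```java
// Hypothetical, simplified version of the terminal-state check done by
// JobFailure.handleFailure: only a successfully finished job passes, so a
// cancelled job surfaces as a RuntimeException and fails the build.
public class TerminalStateCheck {
    enum JobState { DONE, FAILED, CANCELLED }

    static void handleFailure(JobState terminalState) {
        if (terminalState != JobState.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(JobState.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```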

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220210124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d70eb28f103e3451da3491cfddc45351966ba703159dbc8e38f8a3e6a9e05426
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220210124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d70eb28f103e3451da3491cfddc45351966ba703159dbc8e38f8a3e6a9e05426]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220210124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d70eb28f103e3451da3491cfddc45351966ba703159dbc8e38f8a3e6a9e05426])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d70eb28f103e3451da3491cfddc45351966ba703159dbc8e38f8a3e6a9e05426
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d70eb28f103e3451da3491cfddc45351966ba703159dbc8e38f8a3e6a9e05426
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d70eb28f103e3451da3491cfddc45351966ba703159dbc8e38f8a3e6a9e05426].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9ad812c02dd245785430c732fa9769e7eaac5982fbeb695b53b4928449dfd98f
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9ad812c02dd245785430c732fa9769e7eaac5982fbeb695b53b4928449dfd98f
ERROR: (gcloud.container.images.delete) Not found: response: {'docker-distribution-api-version': 'registry/2.0', 'content-type': 'application/json', 'date': 'Thu, 10 Feb 2022 16:03:13 GMT', 'server': 'Docker Registry', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '168', '-content-encoding': 'gzip'}
Failed to compute blob liveness for manifest: 'sha256:9ad812c02dd245785430c732fa9769e7eaac5982fbeb695b53b4928449dfd98f': None

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 297

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 53s
109 actionable tasks: 74 executed, 31 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/2ogl5mpme33i4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #236

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/236/display/redirect?page=changes>

Changes:

[noreply] Update README.md

[marco.robles] Update README with latest PreCommit Jobs

[marco.robles] Update Postcommit jobs with latest jobs

[marco.robles] Update Performace job tests in readme

[marco.robles] update load job tests with latest updates

[marco.robles] update other jobs test with latest updates

[marco.robles] mismatch links fix

[marco.robles] update trigger phrase for some postCommit jobs

[marco.robles] correct trigger phrases in readme

[marco.robles] add pending jobs to readme

[noreply] Update README.md

[mmack] [BEAM-13246] Add support for S3 Bucket Key at the object level (AWS Sdk

[Pablo Estrada] Output successful rows from BQ Streaming Inserts

[schapman] BEAM-13439 Type annotation for ptransform_fn

[noreply] [BEAM-13606] Fail bundles with failed BigTable mutations (#16751)

[mmack] [adhoc] Remove remaining usage of Powermock from aws2.

[marco.robles] fix broken links in jobs & remove the invalid ones

[Kyle Weaver] Update Dataflow Python dev container images.

[Kiley Sok] Add java 17 to changes

[noreply] [BEAM-12914] Add missing 3.9 opcodes to type inference. (#16761)

[noreply] [BEAM-13321] Initial BigQueryIO externalization. (#16489)

[noreply] [BEAM-13193] Enable process bundle response elements embedding in Java

[noreply] [BEAM-13830] added a debeziumio_expansion_addr flag to GoSDK (#16780)

[noreply] Apply spotless. (#16783)

[Daniel Oliveira] [BEAM-13732] Switch x-lang BigQueryIO expansion service to GCP one.

[noreply] [BEAM-13858] Fix broken github action on :sdks:go:examples:wordCount

[Kiley Sok] add jira for runner v2

[noreply] [BEAM-13732] Go SDK BigQuery IO wrapper. Initial implementation.

[noreply] [BEAM-13732] Add example for Go BigQuery IO wrapper. (#16786)

[noreply] Update CHANGES.md with Go SDK milestones. (#16787)

[noreply] [BEAM-13193] Allow BeamFnDataOutboundObserver to flush elements.


------------------------------------------
[...truncated 49.28 KB...]
58c06f1c539d: Preparing
1d91539a970d: Preparing
67ac77c9cbe8: Preparing
2607d1e76597: Preparing
0aa3674558b5: Preparing
7c072cee6a29: Preparing
1e5fdc3d671c: Preparing
613ab28cf833: Preparing
bed676ceab7a: Preparing
d77b999a6aa0: Waiting
6398d5cccd2c: Preparing
0b0f2f2f5279: Preparing
5fc56bb17504: Waiting
0aa3674558b5: Waiting
1d91539a970d: Waiting
2607d1e76597: Waiting
7c072cee6a29: Waiting
67ac77c9cbe8: Waiting
a37adf383347: Waiting
869cf694e13b: Waiting
613ab28cf833: Waiting
0b0f2f2f5279: Waiting
6398d5cccd2c: Waiting
58c06f1c539d: Waiting
bed676ceab7a: Waiting
e66cec84260a: Pushed
30502d728975: Pushed
378d28703ba4: Pushed
038dbb835d07: Pushed
5fc56bb17504: Pushed
214a57736d20: Pushed
a37adf383347: Pushed
67ac77c9cbe8: Pushed
58c06f1c539d: Pushed
0aa3674558b5: Layer already exists
d77b999a6aa0: Pushed
869cf694e13b: Pushed
7c072cee6a29: Layer already exists
613ab28cf833: Layer already exists
1e5fdc3d671c: Layer already exists
bed676ceab7a: Layer already exists
0b0f2f2f5279: Layer already exists
6398d5cccd2c: Layer already exists
2607d1e76597: Pushed
1d91539a970d: Pushed
20220209124334: digest: sha256:223356628f1d4c13d8950fd4844c957a25297f6233684232482b34529bf6676f size: 4520

> Task :sdks:java:testing:load-tests:run
Feb 09, 2022 12:45:22 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Feb 09, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 204 files. Enable logging at DEBUG level to see which files will be staged.
Feb 09, 2022 12:45:23 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Feb 09, 2022 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Feb 09, 2022 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 204 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Feb 09, 2022 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 204 files cached, 0 files newly uploaded in 1 seconds
Feb 09, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Feb 09, 2022 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <114077 bytes, hash dcaa666cf308bd25bd7794484787973f598ff3e67f5cab79a9e4a1ad5b859154> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-3KpmbPMIvSW9d5RIR4eXP1mP8-Z_XKt5qeShrVuFkVQ.pb
Feb 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Feb 09, 2022 12:45:28 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3dd31157, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31c628e7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3240b2a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58434b19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d3fb0ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7dbe2ebf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4adc663e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@885e7ff, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@8bd86c8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fa9ab6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d3ef181, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a2341c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e4c0d8c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e3315d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@64db4967, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@74e6094b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a485a36, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5cf3157b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@625dfff3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26350ea2]
Feb 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Feb 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Feb 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Feb 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Feb 09, 2022 12:45:28 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@806996, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78b612c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@257e0827, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@22752544, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@21ba2445, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@69d23296, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c3820bb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@376c7d7d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4784efd9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3fba233d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@427ae189, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@16a9eb2e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76332405, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@187e5235, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d1d8e1a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5434e40c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b48e183, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@514de325, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@30c1da48, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a65cd8]
Feb 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Feb 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Feb 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Feb 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Feb 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Feb 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Feb 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Feb 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Feb 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Feb 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Feb 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Feb 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Feb 09, 2022 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-09_04_45_28-12291619623864716731?project=apache-beam-testing
Feb 09, 2022 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-02-09_04_45_28-12291619623864716731
Feb 09, 2022 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-02-09_04_45_28-12291619623864716731
Feb 09, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-02-09T12:45:38.543Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-02-mkes. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Feb 09, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:44.030Z: Worker configuration: e2-standard-2 in us-central1-b.
Feb 09, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:44.706Z: Expanding SplittableParDo operations into optimizable parts.
Feb 09, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:44.744Z: Expanding CollectionToSingleton operations into optimizable parts.
Feb 09, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:44.810Z: Expanding CoGroupByKey operations into optimizable parts.
Feb 09, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:44.873Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Feb 09, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:44.894Z: Expanding GroupByKey operations into streaming Read/Write steps
Feb 09, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:44.964Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.070Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.100Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.136Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.165Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.190Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.267Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.299Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.331Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.361Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.388Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.419Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.464Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.523Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.567Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.606Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.628Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.665Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.698Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.732Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.755Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.797Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.839Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:45.878Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Feb 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:46.295Z: Starting 5 workers in us-central1-b...
Feb 09, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:45:53.961Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Feb 09, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:46:28.357Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 09, 2022 12:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:47:31.571Z: Workers have started successfully.
Feb 09, 2022 12:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:47:31.602Z: Workers have started successfully.
Feb 09, 2022 4:00:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T16:00:41.127Z: Cancel request is committed for workflow job: 2022-02-09_04_45_28-12291619623864716731.
Feb 09, 2022 4:00:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T16:00:41.214Z: Cleaning up.
Feb 09, 2022 4:00:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T16:00:41.289Z: Stopping worker pool...
Feb 09, 2022 4:00:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T16:00:41.336Z: Stopping worker pool...
Feb 09, 2022 4:03:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T16:03:07.241Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 09, 2022 4:03:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T16:03:07.272Z: Worker pool stopped.
Feb 09, 2022 4:03:13 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-02-09_04_45_28-12291619623864716731 finished with status CANCELLED.
Load test results for test (ID): 10e3dad5-0b89-41b6-b29a-e409ab737c1e and timestamp: 2022-02-09T12:45:22.814000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11560.618
dataflow_v2_java11_total_bytes_count             2.20859091E10
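As a sanity check on the two counters above (an illustrative calculation, not part of the test output): the reported byte count over the reported runtime implies an aggregate throughput of roughly 1.9 MB/s for this run:

```java
// Illustrative throughput calculation from the metrics reported above.
public class ThroughputCheck {
    public static void main(String[] args) {
        double runtimeSec = 11560.618;      // dataflow_v2_java11_runtime_sec
        double totalBytes = 2.20859091e10;  // dataflow_v2_java11_total_bytes_count
        double mbPerSec = totalBytes / runtimeSec / 1e6;
        System.out.printf("throughput: %.2f MB/s%n", mbPerSec);
    }
}
```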
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220209124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:223356628f1d4c13d8950fd4844c957a25297f6233684232482b34529bf6676f
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220209124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:223356628f1d4c13d8950fd4844c957a25297f6233684232482b34529bf6676f]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220209124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:223356628f1d4c13d8950fd4844c957a25297f6233684232482b34529bf6676f])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:223356628f1d4c13d8950fd4844c957a25297f6233684232482b34529bf6676f
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:223356628f1d4c13d8950fd4844c957a25297f6233684232482b34529bf6676f
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:223356628f1d4c13d8950fd4844c957a25297f6233684232482b34529bf6676f].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 59s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/2nwxap4lptptq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #235

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/235/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-12976] Log projection pushdown optimizations.

[benjamin.gonzalez] [BEAM-12572] Change jobs to run as cron jobs

[Ismaël Mejía] [BEAM-13839] Upgrade zstd-jni to version 1.5.2-1

[mmack] [BEAM-13840] Fix usage of legacy rawtypes in AWS modules

[alexander.zhuravlev] [BEAM-13820] Changed color of delete icon in pipeline options dropdown,

[noreply] [BEAM-11971] Revert "Fix timer consistency in direct runner" (#16748)

[noreply] [BEAM-13193] Aggregates fn api outbound data/timers of different

[noreply] [BEAM-13767] Migrate a bundle of grade tasks to use configuration

[noreply] Merge pull request #16653 from [BEAM-12164]: Add integration tests for

[noreply] Merge pull request #16728 from [BEAM-13823] Update docs for SnowflakeIO

[noreply] Merge pull request #16660 from [BEAM-13771][Playground] Send multifile

[noreply] Merge pull request #16646 from [BEAM-13643][Playground] Setup running

[noreply] [BEAM-13015] Add state caching benchmark and move benchmarks to their

[noreply] [BEAM-13419] Check for initialization in dataflow runner (#16765)

[noreply] Merge pull request #16701 from [BEAM-13786] [Playground] [Bugfix] Update

[noreply] Merge pull request #16754 from [BEAM-13838][Playground] Add logs in case

[noreply] [BEAM-13293] consistent naming for expansion service address and flag

[noreply] Merge pull request #16700 from [BEAM-13790][Playground] Change logic of

[noreply] [BEAM-13830] update dependency for debeziumio expansion service (#16743)

[noreply] [BEAM-13761] consistent namings for expansion address in Debezium IO

[noreply] [BEAM-13806] Shutting down SchemaIO expansion services from Go VR

[noreply] [release-2.36.0] Update website/changelog for release 2.36.0 (#16627)

[noreply] [BEAM-13848] Update numpy intersphinx link (#16767)

[noreply] [release-23.6.0] Fix JIRA link for 2.36 blog (#16771)

[noreply] [BEAM-13647] Use role for Go worker binary. (#16729)


------------------------------------------
[...truncated 49.38 KB...]
0aa3674558b5: Layer already exists
7c072cee6a29: Layer already exists
1e5fdc3d671c: Layer already exists
613ab28cf833: Layer already exists
bed676ceab7a: Layer already exists
6398d5cccd2c: Layer already exists
0b0f2f2f5279: Layer already exists
3f5fa4e217b9: Pushed
79e4afa916f0: Pushed
d0a88de6d715: Pushed
d95d966c943e: Pushed
20220208125609: digest: sha256:de0d91a58578225b2f7d9c60b619a573956dfb912c73dec95c2f349d7da1826a size: 4520

> Task :sdks:java:testing:load-tests:run
Feb 08, 2022 12:59:53 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Feb 08, 2022 12:59:53 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 204 files. Enable logging at DEBUG level to see which files will be staged.
Feb 08, 2022 12:59:55 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Feb 08, 2022 12:59:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Feb 08, 2022 12:59:59 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 204 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Feb 08, 2022 1:00:00 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 204 files cached, 0 files newly uploaded in 1 seconds
Feb 08, 2022 1:00:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Feb 08, 2022 1:00:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <114077 bytes, hash 91e84a6f8b002deb2d1f044b1a9892dc5e43e3665547133f439fbe112505c373> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-kehKb4sALestHwRLGpiS3F5D42ZVRxM_Q5--ESUFw3M.pb
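The staged filename above (pipeline-kehKb4sA....pb) appears to be derived from the logged SHA-256 hash, encoded as unpadded URL-safe base64 (0x91 0xe8 0x4a encodes to "kehK"). A minimal sketch of that naming scheme; the class and method names here are hypothetical, not Beam's actual staging code:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Base64;

public class StagedNameSketch {
    // Derive a content-addressed artifact name from the payload's SHA-256,
    // using unpadded URL-safe base64 as seen in the staged pipeline proto name.
    static String stagedName(byte[] payload) {
        try {
            byte[] digest = MessageDigest.getInstance("SHA-256").digest(payload);
            return "pipeline-"
                    + Base64.getUrlEncoder().withoutPadding().encodeToString(digest)
                    + ".pb";
        } catch (NoSuchAlgorithmException e) {
            // SHA-256 is guaranteed to be present in every JRE.
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(stagedName("example".getBytes(StandardCharsets.UTF_8)));
    }
}
```

Content-addressed names let the runner skip re-uploading files whose digests already exist in the staging bucket, which is why the log reports "204 files cached, 0 files newly uploaded".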
Feb 08, 2022 1:00:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Feb 08, 2022 1:00:03 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3240b2a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58434b19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d3fb0ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7dbe2ebf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4adc663e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@885e7ff, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@8bd86c8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fa9ab6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d3ef181, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a2341c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e4c0d8c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e3315d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@64db4967, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@74e6094b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a485a36, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5cf3157b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@625dfff3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26350ea2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e9469b8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a08efdc]
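The split above divides the synthetic source into 20 bundles; the log prints only the bundle objects, not their sizes. A minimal even-split sketch under the assumption of a near-uniform division (this is an illustration, not Beam's actual SyntheticUnboundedSource.split, and the names are hypothetical):

```java
import java.util.ArrayList;
import java.util.List;

public class BundleSplitSketch {
    // Split totalRecords into numBundles sizes that differ by at most one,
    // distributing the remainder over the first bundles.
    static List<Long> split(long totalRecords, int numBundles) {
        List<Long> sizes = new ArrayList<>();
        long base = totalRecords / numBundles;
        long remainder = totalRecords % numBundles;
        for (int i = 0; i < numBundles; i++) {
            sizes.add(base + (i < remainder ? 1 : 0));
        }
        return sizes;
    }

    public static void main(String[] args) {
        System.out.println(split(103, 20)); // three bundles of 6, seventeen of 5
    }
}
```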
Feb 08, 2022 1:00:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Feb 08, 2022 1:00:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Feb 08, 2022 1:00:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Feb 08, 2022 1:00:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Feb 08, 2022 1:00:03 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@257e0827, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@22752544, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@21ba2445, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@69d23296, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c3820bb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@376c7d7d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4784efd9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3fba233d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@427ae189, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@16a9eb2e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76332405, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@187e5235, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d1d8e1a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5434e40c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b48e183, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@514de325, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@30c1da48, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a65cd8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f1ef9d6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@17461db]
Feb 08, 2022 1:00:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Feb 08, 2022 1:00:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Feb 08, 2022 1:00:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Feb 08, 2022 1:00:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Feb 08, 2022 1:00:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Feb 08, 2022 1:00:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Feb 08, 2022 1:00:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Feb 08, 2022 1:00:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Feb 08, 2022 1:00:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Feb 08, 2022 1:00:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Feb 08, 2022 1:00:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Feb 08, 2022 1:00:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Feb 08, 2022 1:00:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-08_05_00_03-14329417766500327917?project=apache-beam-testing
Feb 08, 2022 1:00:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-02-08_05_00_03-14329417766500327917
Feb 08, 2022 1:00:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-02-08_05_00_03-14329417766500327917
Feb 08, 2022 1:00:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-02-08T13:00:11.656Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-02-fq8i. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Feb 08, 2022 1:00:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:16.190Z: Worker configuration: e2-standard-2 in us-central1-b.
Feb 08, 2022 1:00:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:16.905Z: Expanding SplittableParDo operations into optimizable parts.
Feb 08, 2022 1:00:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:16.937Z: Expanding CollectionToSingleton operations into optimizable parts.
Feb 08, 2022 1:00:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.010Z: Expanding CoGroupByKey operations into optimizable parts.
Feb 08, 2022 1:00:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.079Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Feb 08, 2022 1:00:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.106Z: Expanding GroupByKey operations into streaming Read/Write steps
Feb 08, 2022 1:00:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.165Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.390Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.454Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.496Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.530Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.565Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.600Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.636Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.681Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.720Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.751Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.783Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.816Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.850Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.880Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.913Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.943Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:17.978Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:18.011Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:18.059Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:18.110Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:18.133Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:18.172Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:18.204Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Feb 08, 2022 1:00:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:18.567Z: Starting 5 workers in us-central1-b...
Feb 08, 2022 1:00:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:50.250Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Feb 08, 2022 1:00:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:00:58.016Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 08, 2022 1:01:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:01:58.443Z: Workers have started successfully.
Feb 08, 2022 1:01:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T13:01:58.523Z: Workers have started successfully.
Feb 08, 2022 4:00:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T16:00:43.652Z: Cancel request is committed for workflow job: 2022-02-08_05_00_03-14329417766500327917.
Feb 08, 2022 4:00:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T16:00:43.754Z: Cleaning up.
Feb 08, 2022 4:00:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T16:00:43.812Z: Stopping worker pool...
Feb 08, 2022 4:00:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T16:00:43.853Z: Stopping worker pool...
Feb 08, 2022 4:03:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T16:03:03.218Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 08, 2022 4:03:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-08T16:03:03.257Z: Worker pool stopped.
Feb 08, 2022 4:03:09 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-02-08_05_00_03-14329417766500327917 finished with status CANCELLED.
Load test results for test (ID): 98f52b9f-7fc8-43e3-afd4-0eb01ceb8593 and timestamp: 2022-02-08T12:59:54.393000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 10693.906
dataflow_v2_java11_total_bytes_count             1.46058293E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
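The RuntimeException above comes from the load-test harness treating CANCELLED as a failing terminal state: the streaming job was cancelled (here after roughly three hours of runtime), and the harness then fails the Gradle task. A minimal sketch of such a check; the enum and method names are hypothetical, not the actual JobFailure implementation:

```java
public class TerminalStateSketch {
    enum JobState { DONE, CANCELLED, FAILED, RUNNING }

    // A load test accepts only DONE as a successful terminal state, so a
    // manually or timeout-cancelled job still fails the invoking build task.
    static void checkTerminalState(JobState state) {
        if (state == JobState.CANCELLED || state == JobState.FAILED) {
            throw new RuntimeException("Invalid job state: " + state + ".");
        }
    }

    public static void main(String[] args) {
        try {
            checkTerminalState(JobState.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```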

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220208125609
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:de0d91a58578225b2f7d9c60b619a573956dfb912c73dec95c2f349d7da1826a
Deleted: sha256:9966169d97db3b794e3d00b2b4c4052e8f20d6fb790b6379b22db2daf94df1db
Deleted: sha256:3a4bf4ac56d605e7e35f46073c4adcdf0d0f1447cb1cd7c317f0f2f5bd3d81f0
Deleted: sha256:2761c5c5bc2905f7874125019b6f9d800bc1ea901f98db23974e969234688ff3
Deleted: sha256:0eb90dcde92badd20aefce84345e738e4b495440653e1932eb81386e3629cf23
Deleted: sha256:baad742a2ce89bcc1f0d172c089d25d4f4296b2c7cb07371e627920a2783b14c
Deleted: sha256:ad2cad60ef842f597982851df6f4e5671b98aff7b2a11b390fee7799a0dc4db5
Deleted: sha256:76f9c8a34f7b68a18e0b47a0a671c62b4765cc6eb4c19abec8d6f3266d7e3cbc
Deleted: sha256:17c1408e765a1791cac7127b68c3f81f6df369887b8be29e39ffc7477f187a93
Deleted: sha256:b67f7a26d1bffdefa895fad0f6fc5cb69338f499da0ba2973869c0b5faa9a554
Deleted: sha256:c9a917211b827ec2e069c5d3c63e8a761617d148482d17f1622120d65d2de059
Deleted: sha256:7010b557bead148ba239d672c9cf37bb8e257c3ccbf0133a140f7c9aa57a05bd
Deleted: sha256:5ed1065b58c77e7a4d5deff7ee9944e655de4f52ab3b5ec5a480c8b6eaa851ea
Deleted: sha256:6acef123fc6956883728d2fc3be46cb1c5fa146efc8e221eb0fe1cf2525d4156
Deleted: sha256:2a39641026428cb92e175272450ce9e59475b2fc04243c84879cbbcc1741a19e
Deleted: sha256:7fc5d389490a742166f166980214c363b004e94eed53988e786fc53324ff03fd
Deleted: sha256:8ae52dab523527bd7e4435dbc1125db6a18eefc11f28eb0493098bdcd926a4cf
Deleted: sha256:e16a3ee959c56bf6c48ca4c33362ab3989829d2985cff762f934cbbcc9527ee0
Deleted: sha256:6db5826c29e293d615feb426ec25c629f2c4b15a6a75efd7bcc38950c9eb87a2
Deleted: sha256:dbeaa5bbd78d2d7cd7a146b230edfd6094a79266abcce02e32bdbcd8591e6c78
Deleted: sha256:218be1b3302c00a7d2f27bea0cb5ace01ef14ec3e33a61588573bf3c25aa52f0
Deleted: sha256:21f2880dc0e11796b416621158f54de5e19bbd5c5501e6883e6dda616838d535
Deleted: sha256:777905852eaa0fc0a0b26ee69d783c2fe759510f9c01b1ba9632fa37e748a5e3
Deleted: sha256:6202ca84ef19abdd5d9b8bc272216b887892095bfe1d13959383bcb98595d770
Deleted: sha256:bb195afe5443b3e7b773c8cda77cdbadaf75861822f4370614842c40b9215660
Deleted: sha256:139a03ee025a9ff0e1bc4e64f9ba9e22c8c3a7e22e9b2f01adbb57c7f119e96c
Deleted: sha256:7da729d1b31e582ef2c20b6b30e10464921db83e3c7fd2ba5db65174f2c18f07
Deleted: sha256:7a306a9a8889a6528461b65cbde8b8aea5c89e3998fd78fdfea97df5b4efafd5
Deleted: sha256:291ff9501a3ca7f30ca57867af10b2a3f9469aca237d29ccd593539fd4f7730d
Deleted: sha256:d05d2b606c8c876a43ac272fb618a39feb5a426be84900b94f31cef7289764c2
Deleted: sha256:5639912a3f0df986498f5aa9d83715487c92f7b975fdd655bccd28e496cfaad9
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220208125609]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:de0d91a58578225b2f7d9c60b619a573956dfb912c73dec95c2f349d7da1826a]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220208125609] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:de0d91a58578225b2f7d9c60b619a573956dfb912c73dec95c2f349d7da1826a])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:de0d91a58578225b2f7d9c60b619a573956dfb912c73dec95c2f349d7da1826a
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:de0d91a58578225b2f7d9c60b619a573956dfb912c73dec95c2f349d7da1826a
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:de0d91a58578225b2f7d9c60b619a573956dfb912c73dec95c2f349d7da1826a].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30c96bbc415702dedc0923f914f0ac2a7fda1f51fb0c397ee3c621eaadc6341c
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30c96bbc415702dedc0923f914f0ac2a7fda1f51fb0c397ee3c621eaadc6341c
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30c96bbc415702dedc0923f914f0ac2a7fda1f51fb0c397ee3c621eaadc6341c].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 7m 42s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/op23ssz7lggey

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 234 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 234 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/234/ to view the results.

Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #233

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/233/display/redirect?page=changes>

Changes:

[mmack] [BEAM-13663] Remove unused duplicate option for AWS client configuration

[mmack] [BEAM-13203] Deprecate SnsIO.writeAsync for AWS Sdk v2 due to risk of

[noreply] [BEAM-13828] Fix stale bot (#16734)

[noreply] Merge pull request #16364 from [BEAM-13182]  Add diagrams to backend

[noreply] [BEAM-13811] Fix save_main_session arg in tests examples (#16709)

[Kiley Sok] Update beam-master version

[noreply] [BEAM-13015] Calculate exception for closing BeamFnDataInboundObserver2

[noreply] Minor doc tweaks for validating vendoring. (#16747)

[noreply] [BEAM-13686] OOM while logging a large pipeline even when logging level

[noreply] [BEAM-13629] Update URL artifact type for Dataflow Go (#16490)

[noreply] [BEAM-13832] Add automated expansion service start-up to JDBCio (#16739)

[noreply] [BEAM-13831] Add automated expansion service infra into Debezium Read()

[noreply] [BEAM-13821] Add automated expansion service start-up to KafkaIO

[noreply] [BEAM-13799] Created a Dataproc cluster manager for Interactive Beam

[noreply] Merge pull request #16727: [BEAM-11971] remove unsafe Concurrent data


------------------------------------------
[...truncated 49.78 KB...]
bd18a1a09476: Preparing
6f78efdc0a6b: Preparing
352660b137e6: Preparing
4590f8c89770: Preparing
ca33502d2cac: Preparing
0aa3674558b5: Preparing
7c072cee6a29: Preparing
1e5fdc3d671c: Preparing
613ab28cf833: Preparing
bed676ceab7a: Preparing
6398d5cccd2c: Preparing
84954f958ede: Waiting
0aa3674558b5: Waiting
2a5e893f830d: Waiting
bd18a1a09476: Waiting
352660b137e6: Waiting
7c072cee6a29: Waiting
ca33502d2cac: Waiting
4590f8c89770: Waiting
6f78efdc0a6b: Waiting
1e5fdc3d671c: Waiting
0b0f2f2f5279: Preparing
69984deac861: Waiting
6398d5cccd2c: Waiting
0b0f2f2f5279: Waiting
c0c80369275f: Pushed
ea7e0dd8d579: Pushed
eafcc0a37937: Pushed
0bc31fcf22e4: Pushed
84954f958ede: Pushed
73c3bea82b73: Pushed
2a5e893f830d: Pushed
6f78efdc0a6b: Pushed
bd18a1a09476: Pushed
0aa3674558b5: Layer already exists
7c072cee6a29: Layer already exists
4590f8c89770: Pushed
1e5fdc3d671c: Layer already exists
613ab28cf833: Layer already exists
bed676ceab7a: Layer already exists
69984deac861: Pushed
6398d5cccd2c: Layer already exists
0b0f2f2f5279: Layer already exists
ca33502d2cac: Pushed
352660b137e6: Pushed
20220205125145: digest: sha256:055c05273d506a6e8284e92fc3428bb39a7bb498d9237f4ba1aa6b63ac7d017b size: 4520

> Task :sdks:java:testing:load-tests:run
Feb 05, 2022 12:54:32 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Feb 05, 2022 12:54:33 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 204 files. Enable logging at DEBUG level to see which files will be staged.
Feb 05, 2022 12:54:34 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Feb 05, 2022 12:54:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Feb 05, 2022 12:54:37 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 204 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Feb 05, 2022 12:54:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 204 files cached, 0 files newly uploaded in 1 seconds
Feb 05, 2022 12:54:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Feb 05, 2022 12:54:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <114077 bytes, hash da1da67ee3a7a67940df1af68529f9f579d6853ec42a1c3e292b590edd29193b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-2h2mfuOnpnlA3xr2hSn59XnWhT7EKhw-KStZDt0pGTs.pb
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Feb 05, 2022 12:54:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@574a89e2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e1e9ef3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3dd31157, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31c628e7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3240b2a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58434b19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d3fb0ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7dbe2ebf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4adc663e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@885e7ff, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@8bd86c8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fa9ab6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d3ef181, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a2341c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e4c0d8c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e3315d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@64db4967, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@74e6094b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a485a36, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5cf3157b]
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Feb 05, 2022 12:54:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c5228e7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@38e7ed69, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@806996, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78b612c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@257e0827, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@22752544, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@21ba2445, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@69d23296, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c3820bb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@376c7d7d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4784efd9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3fba233d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@427ae189, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@16a9eb2e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76332405, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@187e5235, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d1d8e1a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5434e40c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b48e183, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@514de325]
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Feb 05, 2022 12:54:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-05_04_54_41-13484680452619416034?project=apache-beam-testing
Feb 05, 2022 12:54:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-02-05_04_54_41-13484680452619416034
Feb 05, 2022 12:54:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-02-05_04_54_41-13484680452619416034
Feb 05, 2022 12:54:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-02-05T12:54:49.223Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-02-9qkg. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Feb 05, 2022 12:54:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:53.857Z: Worker configuration: e2-standard-2 in us-central1-b.
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:54.395Z: Expanding SplittableParDo operations into optimizable parts.
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:54.442Z: Expanding CollectionToSingleton operations into optimizable parts.
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:54.520Z: Expanding CoGroupByKey operations into optimizable parts.
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:54.636Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:54.713Z: Expanding GroupByKey operations into streaming Read/Write steps
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:54.759Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:54.869Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:54.908Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:54.973Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.004Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.037Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.079Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.120Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.166Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.194Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.228Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.249Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.276Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.303Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.336Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.367Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.402Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.437Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.474Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.504Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.539Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.573Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.608Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.643Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:56.146Z: Starting 5 workers in us-central1-b...
Feb 05, 2022 12:55:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:55:06.874Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Feb 05, 2022 12:55:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:55:40.608Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 05, 2022 12:56:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:56:41.210Z: Workers have started successfully.
Feb 05, 2022 12:56:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:56:41.244Z: Workers have started successfully.
Feb 05, 2022 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T16:00:45.291Z: Cancel request is committed for workflow job: 2022-02-05_04_54_41-13484680452619416034.
Feb 05, 2022 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T16:00:45.365Z: Cleaning up.
Feb 05, 2022 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T16:00:45.423Z: Stopping worker pool...
Feb 05, 2022 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T16:00:45.465Z: Stopping worker pool...
Feb 05, 2022 4:03:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T16:03:14.878Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 05, 2022 4:03:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T16:03:15.058Z: Worker pool stopped.
Feb 05, 2022 4:03:23 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-02-05_04_54_41-13484680452619416034 finished with status CANCELLED.
Load test results for test (ID): 6a3f8a7b-51ed-4375-acb8-a03a87edb427 and timestamp: 2022-02-05T12:54:33.571000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11018.873
dataflow_v2_java11_total_bytes_count             1.31740493E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220205125145
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:055c05273d506a6e8284e92fc3428bb39a7bb498d9237f4ba1aa6b63ac7d017b
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220205125145]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:055c05273d506a6e8284e92fc3428bb39a7bb498d9237f4ba1aa6b63ac7d017b]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220205125145] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:055c05273d506a6e8284e92fc3428bb39a7bb498d9237f4ba1aa6b63ac7d017b])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:055c05273d506a6e8284e92fc3428bb39a7bb498d9237f4ba1aa6b63ac7d017b
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:055c05273d506a6e8284e92fc3428bb39a7bb498d9237f4ba1aa6b63ac7d017b
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:055c05273d506a6e8284e92fc3428bb39a7bb498d9237f4ba1aa6b63ac7d017b].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 12m 5s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/hegfl4wykygb2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #232

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/232/display/redirect?page=changes>

Changes:

[Kiley Sok] Allow Java 17 to be used in SDK

[Kiley Sok] add testing support

[Kiley Sok] Add more testing support for java 17

[Kiley Sok] workaround for jamm

[Kiley Sok] Add Jenkins test for Java 17

[Kiley Sok] Fix jvm hex and skip errorprone

[Kiley Sok] Fix display data for anonymous classes

[Kiley Sok] fix jpms tests

[Kiley Sok] skip zetasql

[Kiley Sok] spotless

[Kiley Sok] spotless

[Kiley Sok] Fix trigger

[Kiley Sok] skip checker framework

[Kiley Sok] fix app name

[Kiley Sok] remove duplicate property check

[Heejong Lee] [BEAM-13813] Add support for URL artifact to extractStagingToPath

[avilovpavel6] Remove Python SQL Test example from catalog

[relax] Fix timer consistency in direct runner

[noreply] [BEAM-13757] adds pane observation in DoFn (#16629)

[Jan Lukavský] Change links to Books from Amazon to Publisher

[noreply] [BEAM-13605] Add support for pandas 1.4.0 (#16590)

[noreply] [BEAM-13761] adds Debezium IO wrapper for Go SDK (#16642)

[noreply] [BEAM-13024] Unify PipelineOptions behavior (#16719)

[noreply] Update sdks/go/pkg/beam/artifact/materialize_test.go

[noreply] Merge pull request #16605 from [BEAM-13634][Playground] Create a

[noreply] Merge pull request #16593 from [BEAM-13725][Playground] Add graph to the

[noreply] Merge pull request #16699 from [BEAM-13789][Playground] Change logic of

[alexander.chermenin] Fixed CSS for Case study page


------------------------------------------
[...truncated 49.87 KB...]
c19bd81b1efb: Waiting
4b8d37f582e3: Preparing
0aa3674558b5: Preparing
7c072cee6a29: Preparing
1e5fdc3d671c: Preparing
613ab28cf833: Preparing
8b589c1403fa: Waiting
bed676ceab7a: Preparing
6398d5cccd2c: Preparing
0b0f2f2f5279: Preparing
0aa3674558b5: Waiting
7c072cee6a29: Waiting
0b0f2f2f5279: Waiting
bed676ceab7a: Waiting
6398d5cccd2c: Waiting
1e5fdc3d671c: Waiting
613ab28cf833: Waiting
e80698b4686c: Waiting
4b8d37f582e3: Waiting
43ca28afbb62: Waiting
bdc334ec4ad3: Waiting
713d88b5bc37: Pushed
f2f97eb50931: Pushed
b672ffe8299a: Pushed
6d802e23ba91: Pushed
891460082312: Pushed
583e128fea73: Pushed
8faf192789a7: Pushed
e80698b4686c: Pushed
8b589c1403fa: Pushed
0aa3674558b5: Layer already exists
7c072cee6a29: Layer already exists
bdc334ec4ad3: Pushed
1e5fdc3d671c: Layer already exists
613ab28cf833: Layer already exists
bed676ceab7a: Layer already exists
c19bd81b1efb: Pushed
6398d5cccd2c: Layer already exists
0b0f2f2f5279: Layer already exists
4b8d37f582e3: Pushed
43ca28afbb62: Pushed
20220204124342: digest: sha256:3ee374fec2e9f88c91a044b0d0ff9cdeae20b4dd8e7b92ec1240d584c61368cc size: 4520

> Task :sdks:java:testing:load-tests:run
Feb 04, 2022 12:46:41 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Feb 04, 2022 12:46:42 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 204 files. Enable logging at DEBUG level to see which files will be staged.
Feb 04, 2022 12:46:43 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Feb 04, 2022 12:46:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Feb 04, 2022 12:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 204 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Feb 04, 2022 12:46:47 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 204 files cached, 0 files newly uploaded in 0 seconds
Feb 04, 2022 12:46:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Feb 04, 2022 12:46:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <114077 bytes, hash 595adb75bf06a4399b57cbd9a7a9bc641ea4e3cac2d64784a40e96a902df8ccc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-WVrbdb8GpDmbV8vZp6m8ZB6k48rC1keEpA6WqQLfjMw.pb
Feb 04, 2022 12:46:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Feb 04, 2022 12:46:49 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d805aa1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@176f7f3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@30ca0779, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58740366, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47be0f9b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2bc426f0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4bd51d3e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33425811, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b74b35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@e4e1ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d11ceef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cb2918c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@72e295cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@c2584d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa0450e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37468787, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@51ec2856, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@714f3da4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1caa9eb6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f53481b]
Feb 04, 2022 12:46:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Feb 04, 2022 12:46:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Feb 04, 2022 12:46:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Feb 04, 2022 12:46:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Feb 04, 2022 12:46:49 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34b9eb03, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43fda8d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49d831c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53a7a60c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@28bdbe88, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a87026, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ef60710, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@600f5704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2503ec73, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@606f81b5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e1fc42f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44b21f9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ee8130e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6296474f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4288d98e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@169268a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@285c6918, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78a0ff63, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c601d50]
Feb 04, 2022 12:46:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Feb 04, 2022 12:46:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Feb 04, 2022 12:46:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Feb 04, 2022 12:46:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Feb 04, 2022 12:46:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Feb 04, 2022 12:46:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Feb 04, 2022 12:46:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Feb 04, 2022 12:46:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Feb 04, 2022 12:46:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Feb 04, 2022 12:46:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Feb 04, 2022 12:46:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Feb 04, 2022 12:46:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Feb 04, 2022 12:46:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-04_04_46_49-1907528646063558349?project=apache-beam-testing
Feb 04, 2022 12:46:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-02-04_04_46_49-1907528646063558349
Feb 04, 2022 12:46:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-02-04_04_46_49-1907528646063558349
Feb 04, 2022 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-02-04T12:46:58.149Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-02-4grb. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:08.245Z: Worker configuration: e2-standard-2 in us-central1-b.
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:08.933Z: Expanding SplittableParDo operations into optimizable parts.
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:08.998Z: Expanding CollectionToSingleton operations into optimizable parts.
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.052Z: Expanding CoGroupByKey operations into optimizable parts.
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.117Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.156Z: Expanding GroupByKey operations into streaming Read/Write steps
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.216Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.322Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.358Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.382Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.406Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.428Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.452Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.477Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.502Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.528Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.549Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.576Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.601Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.634Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.669Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.702Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.735Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.770Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.824Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.858Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.892Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.924Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:09.979Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:10.008Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Feb 04, 2022 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:10.469Z: Starting 5 workers in us-central1-b...
Feb 04, 2022 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:47:34.446Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Feb 04, 2022 12:48:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:48:00.452Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 04, 2022 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:49:02.670Z: Workers have started successfully.
Feb 04, 2022 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T12:49:02.722Z: Workers have started successfully.
Feb 04, 2022 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T16:00:37.024Z: Cancel request is committed for workflow job: 2022-02-04_04_46_49-1907528646063558349.
Feb 04, 2022 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T16:00:37.089Z: Cleaning up.
Feb 04, 2022 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T16:00:37.203Z: Stopping worker pool...
Feb 04, 2022 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T16:00:37.300Z: Stopping worker pool...
Feb 04, 2022 4:02:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T16:02:54.125Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 04, 2022 4:02:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-04T16:02:54.177Z: Worker pool stopped.
Feb 04, 2022 4:03:00 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-02-04_04_46_49-1907528646063558349 finished with status CANCELLED.
Load test results for test (ID): 1a161861-7766-4ff7-9498-738bfba79e90 and timestamp: 2022-02-04T12:46:42.816000000Z:
Metric:                                  Value:
dataflow_v2_java11_runtime_sec           11338.712
dataflow_v2_java11_total_bytes_count     1.95515622E10
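A quick sanity check on these load-test numbers is to derive throughput from the two reported metrics. A minimal sketch, using only the runtime and byte count printed above (the calculation itself is illustrative, not part of the harness):

```python
# Derive approximate throughput from the load-test metrics reported above.
runtime_sec = 11338.712        # dataflow_v2_java11_runtime_sec
total_bytes = 1.95515622e10    # dataflow_v2_java11_total_bytes_count

bytes_per_sec = total_bytes / runtime_sec
mb_per_sec = bytes_per_sec / 1e6  # decimal megabytes per second

print("throughput: %.0f B/s (~%.2f MB/s)" % (bytes_per_sec, mb_per_sec))
```

This works out to roughly 1.7 MB/s over the ~3h09m streaming run.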
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
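The RuntimeException above is the load-test harness rejecting a terminal state other than DONE: the job was force-cancelled at the 4-hour mark, so JobFailure.handleFailure reports it as a failure. A minimal Python sketch of that check, under the assumption that any non-DONE terminal state fails the test (the real logic lives in org.apache.beam.sdk.loadtests.JobFailure; names here are illustrative):

```python
# Hypothetical mirror of JobFailure.handleFailure: a terminal state other
# than DONE (e.g. CANCELLED, FAILED) fails the load test.
TERMINAL_STATES = {"DONE", "FAILED", "CANCELLED", "UPDATED", "DRAINED"}

def handle_failure(state):
    """Raise if the pipeline ended in a terminal state other than DONE."""
    if state in TERMINAL_STATES and state != "DONE":
        raise RuntimeError("Invalid job state: %s." % state)

handle_failure("DONE")  # passes silently; "CANCELLED" would raise
```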

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220204124342
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3ee374fec2e9f88c91a044b0d0ff9cdeae20b4dd8e7b92ec1240d584c61368cc
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220204124342]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3ee374fec2e9f88c91a044b0d0ff9cdeae20b4dd8e7b92ec1240d584c61368cc]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220204124342] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3ee374fec2e9f88c91a044b0d0ff9cdeae20b4dd8e7b92ec1240d584c61368cc])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3ee374fec2e9f88c91a044b0d0ff9cdeae20b4dd8e7b92ec1240d584c61368cc
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3ee374fec2e9f88c91a044b0d0ff9cdeae20b4dd8e7b92ec1240d584c61368cc
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3ee374fec2e9f88c91a044b0d0ff9cdeae20b4dd8e7b92ec1240d584c61368cc].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0035bdbcee488f123ccaa7ee9d663dfccf8e289224ecbc1989320d0e222f2efa
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0035bdbcee488f123ccaa7ee9d663dfccf8e289224ecbc1989320d0e222f2efa
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0035bdbcee488f123ccaa7ee9d663dfccf8e289224ecbc1989320d0e222f2efa].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 43s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/i7s2cmzcqyfbk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 231 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 231 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/231/ to view the results.

Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #230

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/230/display/redirect?page=changes>

Changes:

[career] [BEAM-13734] Support cache directories that use GCS buckets

[noreply] Merge pull request #16655 from [BEAM-12164]: Add retry protection to

[noreply] Merge pull request #16586 from [BEAM-13731] FhirIO: Add support for

[noreply] [BEAM-13011] Adds a link to Multi-language Pipelines Tips wiki page

[noreply] [BEAM-12572] Run python examples on multiple runners (#16154)

[noreply] [BEAM-13574] Large Wordcount (#16455)

[noreply] [BEAM-13293] Refactor JDBC IO Go Wrapper (#16686)

[noreply] Edit license script for Java, add manual licenses for xz (#16692)


------------------------------------------
[...truncated 50.96 KB...]
a89e28714e65: Pushed
0afb37a61eef: Pushed
a63956e42953: Pushed
eaae80668d8b: Pushed
120d0635c2e7: Pushed
1a0d8be7fecb: Layer already exists
7c072cee6a29: Layer already exists
1e5fdc3d671c: Layer already exists
613ab28cf833: Layer already exists
ab7303d1380c: Pushed
bed676ceab7a: Layer already exists
6398d5cccd2c: Layer already exists
0b0f2f2f5279: Layer already exists
5f89de82cb39: Pushed
1627f9a0564a: Pushed
20220202124338: digest: sha256:d6539f1dff0ea28028603f51cab425bd32b970be6af0edcbdda5d4c8cc90543f size: 4520

> Task :sdks:java:testing:load-tests:run
Feb 02, 2022 12:46:05 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Feb 02, 2022 12:46:09 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 204 files. Enable logging at DEBUG level to see which files will be staged.
Feb 02, 2022 12:46:10 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Feb 02, 2022 12:46:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Feb 02, 2022 12:46:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 204 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Feb 02, 2022 12:46:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 204 files cached, 0 files newly uploaded in 0 seconds
Feb 02, 2022 12:46:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Feb 02, 2022 12:46:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <114077 bytes, hash 451f758bb1d6ad78711bd4be17be988bbca00f3199f0131dd3d03e3e5f3fbae1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-RR91i7HWrXhxG9S-F76Yi7ygDzGZ8BMd09A-Pl8_uuE.pb
Feb 02, 2022 12:46:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Feb 02, 2022 12:46:15 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2100d047, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4af45442, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d805aa1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@176f7f3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@30ca0779, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58740366, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47be0f9b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2bc426f0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4bd51d3e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33425811, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b74b35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@e4e1ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d11ceef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cb2918c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@72e295cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@c2584d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa0450e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37468787, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@51ec2856, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@714f3da4]
Feb 02, 2022 12:46:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Feb 02, 2022 12:46:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Feb 02, 2022 12:46:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Feb 02, 2022 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Feb 02, 2022 12:46:16 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34b9eb03, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43fda8d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49d831c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53a7a60c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@28bdbe88, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a87026, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ef60710, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@600f5704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2503ec73, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@606f81b5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e1fc42f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44b21f9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ee8130e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6296474f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4288d98e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@169268a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@285c6918]
Feb 02, 2022 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Feb 02, 2022 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Feb 02, 2022 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Feb 02, 2022 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Feb 02, 2022 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Feb 02, 2022 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Feb 02, 2022 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Feb 02, 2022 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Feb 02, 2022 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Feb 02, 2022 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Feb 02, 2022 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Feb 02, 2022 12:46:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Feb 02, 2022 12:46:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-02_04_46_16-18133998314817617901?project=apache-beam-testing
Feb 02, 2022 12:46:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-02-02_04_46_16-18133998314817617901
Feb 02, 2022 12:46:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-02-02_04_46_16-18133998314817617901
Feb 02, 2022 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-02-02T12:46:23.607Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-02-yoet. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Feb 02, 2022 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:29.105Z: Worker configuration: e2-standard-2 in us-central1-b.
Feb 02, 2022 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:29.775Z: Expanding SplittableParDo operations into optimizable parts.
Feb 02, 2022 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:29.811Z: Expanding CollectionToSingleton operations into optimizable parts.
Feb 02, 2022 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:29.900Z: Expanding CoGroupByKey operations into optimizable parts.
Feb 02, 2022 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:29.968Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Feb 02, 2022 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.000Z: Expanding GroupByKey operations into streaming Read/Write steps
Feb 02, 2022 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.056Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Feb 02, 2022 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.160Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Feb 02, 2022 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.197Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Feb 02, 2022 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.229Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Feb 02, 2022 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.261Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Feb 02, 2022 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.284Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Feb 02, 2022 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.317Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 02, 2022 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.338Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 02, 2022 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.360Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 02, 2022 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.418Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 02, 2022 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.454Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Feb 02, 2022 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.485Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Feb 02, 2022 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.528Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Feb 02, 2022 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.563Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 02, 2022 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.597Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 02, 2022 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.637Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 02, 2022 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.669Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 02, 2022 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.692Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Feb 02, 2022 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.725Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Feb 02, 2022 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.748Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Feb 02, 2022 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.782Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Feb 02, 2022 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.818Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Feb 02, 2022 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.861Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Feb 02, 2022 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:30.885Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Feb 02, 2022 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:46:31.214Z: Starting 5 workers in us-central1-b...
Feb 02, 2022 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:47:00.651Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Feb 02, 2022 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:47:15.595Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 02, 2022 12:48:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:48:17.631Z: Workers have started successfully.
Feb 02, 2022 12:48:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T12:48:17.661Z: Workers have started successfully.
Feb 02, 2022 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T16:00:36.272Z: Cancel request is committed for workflow job: 2022-02-02_04_46_16-18133998314817617901.
Feb 02, 2022 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T16:00:36.354Z: Cleaning up.
Feb 02, 2022 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T16:00:36.436Z: Stopping worker pool...
Feb 02, 2022 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T16:00:36.494Z: Stopping worker pool...
Feb 02, 2022 4:02:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T16:02:54.502Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 02, 2022 4:02:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-02T16:02:54.547Z: Worker pool stopped.
Feb 02, 2022 4:03:01 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-02-02_04_46_16-18133998314817617901 finished with status CANCELLED.
Load test results for test (ID): 2b66bcb4-e5c7-4d30-ae5d-c1d3bf260d18 and timestamp: 2022-02-02T12:46:09.836000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11510.768
dataflow_v2_java11_total_bytes_count             2.21004923E10
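The two metric values above imply an average end-to-end throughput. As a quick sanity check (plain Python, using only the numbers the load test printed; the variable names are illustrative, not part of the test harness):

```python
# Derive average throughput from the load-test metrics printed above.
runtime_sec = 11510.768        # dataflow_v2_java11_runtime_sec
total_bytes = 2.21004923e10    # dataflow_v2_java11_total_bytes_count

throughput_bps = total_bytes / runtime_sec
print(f"~{throughput_bps / 1e6:.2f} MB/s average")  # prints "~1.92 MB/s average"
```

That is, the ~22 GB processed over the roughly 3.2-hour run before cancellation works out to about 1.92 MB/s.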
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220202124338
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d6539f1dff0ea28028603f51cab425bd32b970be6af0edcbdda5d4c8cc90543f
Deleted: sha256:123eaad56948b21de901271c8288dda548b1d1c337870c6d0a6b34d51ddb81a7
Deleted: sha256:e2980f17f1d47b5b1cc1b003ad784c26eb689fca3db3c7f442ae9cf6c1201f6d
Deleted: sha256:730733f9045c510868de9ed4aedfa46484628a275f8d98b26e21b2ec131ef5c4
Deleted: sha256:b18d377527fa5e1646de41810fb19b9bef72fc949b7e0abe60adf6dfe0178908
Deleted: sha256:2e479672d36f9c63bc3d3a5aa20fc17546f982ce55ca319b125a1daa989cc21b
Deleted: sha256:fcba682443676300048dc2ce59410286fd4222f0ebcfee142523fb32d4d8cb1b
Deleted: sha256:8f5c5b69f9865365f7cc0ed03d4d56e089fb158ea1811a11c924482d1f23905a
Deleted: sha256:40fe6748cb47fd0de85890888dc1a3c78df6b612c54dcc6460ef135f96c805cd
Deleted: sha256:e8852104807b3242f1e85fbf74c3425da5044dedfe45896885e1b9e8f421f303
Deleted: sha256:b29192f6950f8cb8cb11a0db44cfe0229da2b32c82480114ba7ca005628c8075
Deleted: sha256:a73875698268c2f344f16db92548e71055668cb8f5085db62cb1aeb7447cbb68
Deleted: sha256:e84fcfa3dadb190d2b1b9ad8f67617b13b9fa9f0fb29c0b97f080558de701003
Deleted: sha256:d6b0b9197faa7c1532864171a822953245fefe042e56cfe9cfea9e612f60cd2c
Deleted: sha256:fd3e77cf34325ee83c545989c2697d70d2c62da0bad2eed2d1d73c11a9d7b633
Deleted: sha256:fca7aba841a7d8f8c3e2b594d9cb885e548f130f50e79db1f7841c37cd8c33a7
Deleted: sha256:4dd2f552cddf2f1281ec8eca6e4205afdaf6db7c4f30c045fed692984b6070c3
Deleted: sha256:dc5ab14744238929d3c6c5060a67b3d77d248b4a09ba71818870e18ae3f157bd
Deleted: sha256:fa3f27991d3fc074b8c9db6ea261f5810d6bfe022f04218e886e3e125ab299a4
Deleted: sha256:da4d986c718ae5951f69826f84f3301eed70f469d650aff76a842fdb38a318fe
Deleted: sha256:3848d096414abddcf8e63df38197ef168a3c3b5ef712c92168773b9fae558f54
Deleted: sha256:ff8508d2ab472eaee54b8f7d18e46f87a920d50b17ee299a23152dea5c84f40a
Deleted: sha256:84e16c5471ab9455dfbe6611aa9dc4d8b4ace162083803e7bb615b6d186cc665
Deleted: sha256:cf9ce6b88e5d9b03732ce3468dfefd51dacd3241d09d715785d03a1ca4a9be40
Deleted: sha256:f92c02afe0d6dc19b3af79f00a6150a340d5b24816c1fb2334e3dedbbc780c0a
Deleted: sha256:902e329add528333e47e8bce53d9532f0cacdf972486fb0a761e3b1a93bbdc08
Deleted: sha256:bb0e41ddd4ed447d9ca3c3ae6a77ba8c32ecc999039caa0018cc7fe69de89748
Deleted: sha256:d7da683a33932f4056c2ed1acbff035abdba829b17d6464510ec87a88eefe8eb
Deleted: sha256:3a64b0bf9d343c643b4dcd44b770775692691814da26a3e1dc1faced836badfc
Deleted: sha256:952a00834a4f1310976619f2dcea04baac636f483b6c8d1edbe3a602aea1b93f
Deleted: sha256:1663d22179d31f6ccc3681af813c70396a6e74f2eb28a68d723b230030b4502b
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220202124338]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d6539f1dff0ea28028603f51cab425bd32b970be6af0edcbdda5d4c8cc90543f]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220202124338] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d6539f1dff0ea28028603f51cab425bd32b970be6af0edcbdda5d4c8cc90543f])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d6539f1dff0ea28028603f51cab425bd32b970be6af0edcbdda5d4c8cc90543f
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d6539f1dff0ea28028603f51cab425bd32b970be6af0edcbdda5d4c8cc90543f
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d6539f1dff0ea28028603f51cab425bd32b970be6af0edcbdda5d4c8cc90543f].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 45s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/hb36ptj3lj7wa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #229

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/229/display/redirect?page=changes>

Changes:

[daria.malkova] Support SCIO SDK via sbt projects

[samuelw] [BEAM-11648] Share thread pool across RetryManager instances.

[Pablo Estrada] Exclude per-key order tests on Twister2 runner

[Heejong Lee] Fix Java SDK container image name for load-tests and nexmark

[daria.malkova] Change executable name for go tests

[avilovpavel6] Fix java test

[noreply] [BEAM-13769] Skip test_main_session_not_staged_when_using_cloudpickle

[noreply] [BEAM-6744] Support implicitly setting project id in Go Dataflow runner

[noreply] Merge pull request #16493 from [BEAM-13632][Playground] Save catalog

[noreply] Exclude jul-to-slf4j from Spark runner in quickstart POM templates

[noreply] [BEAM-11936] Enable a few errorprone checks that were broken by pinned

[noreply] [BEAM-13780] Add CONTRIBUTING.md pointing to main guide (#16666)

[noreply] [BEAM-13777] Accept cache capacity as input parameter instead of default

[noreply] [BEAM-13051][A] Enable pylint warnings

[noreply] [BEAM-13779] Fix pr labeling (#16665)

[noreply] Merge pull request #16581 from [BEAM-12164]: Add

[noreply] Fix labeler trigger (#16674)

[noreply] [BEAM-13781] Exclude grpc-netty-shaded from gax-grpc's dependency

[noreply] [BEAM-13051] Fixed pylint warnings : raising-non-exception (E0710),

[noreply] [BEAM-13740] Correctly install go before running tests (#16673)

[noreply] [BEAM-12830] Update local Docker env Go version. (#16670)

[noreply] [BEAM-13051][B] Enable pylint warnings

[noreply] [BEAM-13430] Revert Spark libraries in Spark runner to provided (#16675)

[noreply] [BEAM-12240] Add Java 17 support (#16568)

[noreply] [BEAM-13760] Add random component to default python dataflow job name


------------------------------------------
[...truncated 96.01 KB...]
    return self.do_open(http.client.HTTPConnection, req)
  File "/usr/lib/python3.8/urllib/request.py", line 1357, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 101] Network is unreachable>
ERROR:root:Invalid url for checkstyle-8.23: http://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt. Retrying...
Traceback (most recent call last):
  File "/usr/lib/python3.8/urllib/request.py", line 1354, in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
  File "/usr/lib/python3.8/http/client.py", line 1256, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1302, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1251, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1011, in _send_output
    self.send(msg)
  File "/usr/lib/python3.8/http/client.py", line 951, in send
    self.connect()
  File "/usr/lib/python3.8/http/client.py", line 1418, in connect
    super().connect()
  File "/usr/lib/python3.8/http/client.py", line 922, in connect
    self.sock = self._create_connection(
  File "/usr/lib/python3.8/socket.py", line 808, in create_connection
    raise err
  File "/usr/lib/python3.8/socket.py", line 796, in create_connection
    sock.connect(sa)
OSError: [Errno 101] Network is unreachable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py>", line 53, in pull_from_url
    url_read = urlopen(url)
  File "/usr/lib/python3.8/urllib/request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.8/urllib/request.py", line 525, in open
    response = self._open(req, data)
  File "/usr/lib/python3.8/urllib/request.py", line 542, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 1397, in https_open
    return self.do_open(http.client.HTTPSConnection, req,
  File "/usr/lib/python3.8/urllib/request.py", line 1357, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 101] Network is unreachable>
ERROR:root:Invalid url for spotbugs-annotations-4.0.6: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html. Retrying...
Traceback (most recent call last):
  File "/usr/lib/python3.8/urllib/request.py", line 1354, in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
  File "/usr/lib/python3.8/http/client.py", line 1256, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1302, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1251, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1011, in _send_output
    self.send(msg)
  File "/usr/lib/python3.8/http/client.py", line 951, in send
    self.connect()
  File "/usr/lib/python3.8/http/client.py", line 922, in connect
    self.sock = self._create_connection(
  File "/usr/lib/python3.8/socket.py", line 808, in create_connection
    raise err
  File "/usr/lib/python3.8/socket.py", line 796, in create_connection
    sock.connect(sa)
OSError: [Errno 101] Network is unreachable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py>", line 53, in pull_from_url
    url_read = urlopen(url)
  File "/usr/lib/python3.8/urllib/request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.8/urllib/request.py", line 525, in open
    response = self._open(req, data)
  File "/usr/lib/python3.8/urllib/request.py", line 542, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 1383, in http_open
    return self.do_open(http.client.HTTPConnection, req)
  File "/usr/lib/python3.8/urllib/request.py", line 1357, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 101] Network is unreachable>
ERROR:root:Invalid url for jFormatString-3.0.0: http://www.gnu.org/licenses/lgpl.html. Retrying...
Traceback (most recent call last):
  File "/usr/lib/python3.8/urllib/request.py", line 1354, in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
  File "/usr/lib/python3.8/http/client.py", line 1256, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1302, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1251, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1011, in _send_output
    self.send(msg)
  File "/usr/lib/python3.8/http/client.py", line 951, in send
    self.connect()
  File "/usr/lib/python3.8/http/client.py", line 1418, in connect
    super().connect()
  File "/usr/lib/python3.8/http/client.py", line 922, in connect
    self.sock = self._create_connection(
  File "/usr/lib/python3.8/socket.py", line 808, in create_connection
    raise err
  File "/usr/lib/python3.8/socket.py", line 796, in create_connection
    sock.connect(sa)
OSError: [Errno 101] Network is unreachable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py>", line 53, in pull_from_url
    url_read = urlopen(url)
  File "/usr/lib/python3.8/urllib/request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.8/urllib/request.py", line 525, in open
    response = self._open(req, data)
  File "/usr/lib/python3.8/urllib/request.py", line 542, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 1397, in https_open
    return self.do_open(http.client.HTTPSConnection, req,
  File "/usr/lib/python3.8/urllib/request.py", line 1357, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 101] Network is unreachable>
ERROR:root:Invalid url for spotbugs-annotations-4.0.6: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html after 9 retries.
Traceback (most recent call last):
  File "/usr/lib/python3.8/urllib/request.py", line 1354, in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
  File "/usr/lib/python3.8/http/client.py", line 1256, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1302, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1251, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1011, in _send_output
    self.send(msg)
  File "/usr/lib/python3.8/http/client.py", line 951, in send
    self.connect()
  File "/usr/lib/python3.8/http/client.py", line 922, in connect
    self.sock = self._create_connection(
  File "/usr/lib/python3.8/socket.py", line 808, in create_connection
    raise err
  File "/usr/lib/python3.8/socket.py", line 796, in create_connection
    sock.connect(sa)
OSError: [Errno 101] Network is unreachable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py>", line 53, in pull_from_url
    url_read = urlopen(url)
  File "/usr/lib/python3.8/urllib/request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.8/urllib/request.py", line 525, in open
    response = self._open(req, data)
  File "/usr/lib/python3.8/urllib/request.py", line 542, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 1383, in http_open
    return self.do_open(http.client.HTTPConnection, req)
  File "/usr/lib/python3.8/urllib/request.py", line 1357, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 101] Network is unreachable>
ERROR:root:Invalid url for jFormatString-3.0.0: http://www.gnu.org/licenses/lgpl.html after 9 retries.
Traceback (most recent call last):
  File "/usr/lib/python3.8/urllib/request.py", line 1354, in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
  File "/usr/lib/python3.8/http/client.py", line 1256, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1302, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1251, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1011, in _send_output
    self.send(msg)
  File "/usr/lib/python3.8/http/client.py", line 951, in send
    self.connect()
  File "/usr/lib/python3.8/http/client.py", line 922, in connect
    self.sock = self._create_connection(
  File "/usr/lib/python3.8/socket.py", line 808, in create_connection
    raise err
  File "/usr/lib/python3.8/socket.py", line 796, in create_connection
    sock.connect(sa)
OSError: [Errno 101] Network is unreachable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py>", line 53, in pull_from_url
    url_read = urlopen(url)
  File "/usr/lib/python3.8/urllib/request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.8/urllib/request.py", line 525, in open
    response = self._open(req, data)
  File "/usr/lib/python3.8/urllib/request.py", line 542, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 1383, in http_open
    return self.do_open(http.client.HTTPConnection, req)
  File "/usr/lib/python3.8/urllib/request.py", line 1357, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 101] Network is unreachable>
ERROR:root:Invalid url for checkstyle-8.23: http://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt after 9 retries.
ERROR:root:['spotbugs-annotations-4.0.6', 'jFormatString-3.0.0', 'checkstyle-8.23']
ERROR:root:**************************************** Licenses were not able to be pulled automatically for some dependencies. Please search source code of the dependencies on the internet and add "license" and "notice" (if available) field to <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml> for each missing license. Dependency List: [checkstyle-8.23,jFormatString-3.0.0,spotbugs-annotations-4.0.6]
INFO:root:pull_licenses_java.py failed. It took 1226.370358 seconds with 16 threads.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py>", line 314, in <module>
    raise RuntimeError('{n} error(s) occurred.'.format(n=len(error_msg)),
RuntimeError: ('1 error(s) occurred.', ['**************************************** Licenses were not able to be pulled automatically for some dependencies. Please search source code of the dependencies on the internet and add "license" and "notice" (if available) field to <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml> for each missing license. Dependency List: [checkstyle-8.23,jFormatString-3.0.0,spotbugs-annotations-4.0.6]'])
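The failure mode above is the license-fetching script exhausting its retries against an unreachable network. A minimal sketch of that retry pattern (illustrative only — the function name, the injectable `fetch` parameter, and the defaults are assumptions, not the actual `pull_licenses_java.py` API; the 9-retry count matches the "after 9 retries" messages in the log):

```python
"""Sketch of a retry loop around urlopen, as pull_licenses_java.py does
when pulling a dependency's license URL."""
import time
from urllib.error import URLError
from urllib.request import urlopen


def pull_with_retries(url, fetch=lambda u: urlopen(u).read(),
                      retries=9, delay_sec=0.0):
    """Fetch url, retrying up to `retries` extra times on URLError/OSError."""
    last_err = None
    for _attempt in range(retries + 1):
        try:
            return fetch(url)
        except (URLError, OSError) as err:
            last_err = err          # e.g. [Errno 101] Network is unreachable
            time.sleep(delay_sec)   # back off before the next attempt
    raise RuntimeError(
        f"Invalid url {url} after {retries} retries") from last_err
```

With the executor's network down, every attempt raises `[Errno 101] Network is unreachable`, so the dependency ends up on the missing-license list and the task fails, as seen above.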

> Task :sdks:java:container:pullLicenses FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:container:pullLicenses'.
> Process 'command './license_scripts/license_script.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 21m 12s
103 actionable tasks: 66 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/uajqb2awvk6yk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #228

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/228/display/redirect?page=changes>

Changes:

[mrudary] Generalize S3FileSystem to support multiple URI schemes.


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-2 (beam) in workspace <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e638c1183407999bca3e8e4987119c0d9158d00d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e638c1183407999bca3e8e4987119c0d9158d00d # timeout=10
Commit message: "Merge pull request #16607: [BEAM-13245] Generalize S3FileSystem to support multiple URI schemes."
 > git rev-list --no-walk 6c9c208197d3d74b1c3643d22716ad3b00213506 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11] $ /bin/bash -xe /tmp/jenkins4689547643951386454.sh
+ echo '*** Load test: CoGBK 2GB 100  byte records - single key ***'
*** Load test: CoGBK 2GB 100  byte records - single key ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest -Prunner=:runners:google-cloud-dataflow-java '-PloadTest.args=--project=apache-beam-testing --region=us-central1 --appName=load_tests_Java11_Dataflow_V2_streaming_CoGBK_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --influxMeasurement=java_streaming_cogbk_1 --influxTags={"runnerVersion":"v2","jdk":"java11"} --publishToInfluxDB=true --sourceOptions={"numRecords":20000000,"keySizeBytes":10,"valueSizeBytes":90,"numHotKeys":1} --coSourceOptions={"numRecords":2000000,"keySizeBytes":10,"valueSizeBytes":90,"numHotKeys":1000} --iterations=1 --numWorkers=5 --autoscalingAlgorithm=NONE --streaming=true --influxDatabase=beam_test_metrics --influxHost=http://10.128.0.96:8086 --inputWindowDurationSec=1200 --coInputWindowDurationSec=1200 --runner=DataflowRunner' -Prunner.version=V2 -PcompileAndRunTestsWithJava11 -Pjava11Home=/usr/lib/jvm/java-11-openjdk-amd64 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:java:testing:load-tests:run
Starting a Gradle Daemon, 3 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

> Configure project :sdks:go:test
System Go installation: /snap/bin/go is go version go1.16.13 linux/amd64; Preparing to use /home/jenkins/go/bin/go1.16.12
go1.16.12: already downloaded in /home/jenkins/sdk/go1.16.12
GOCMD=/home/jenkins/go/bin/go1.16.12

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/testing/load-tests/build.gradle'> line: 113

* What went wrong:
A problem occurred evaluating project ':sdks:java:testing:load-tests'.
> Could not get unknown property 'dockerImageName' for project ':runners:google-cloud-dataflow-java' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 18s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4ixgjqo44yyqm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #227

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/227/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6c9c208197d3d74b1c3643d22716ad3b00213506 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6c9c208197d3d74b1c3643d22716ad3b00213506 # timeout=10
Commit message: "[BEAM-10206] Add Go Vet to Github Actions (#16612)"
 > git rev-list --no-walk 6c9c208197d3d74b1c3643d22716ad3b00213506 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11] $ /bin/bash -xe /tmp/jenkins415292196793121541.sh
+ echo '*** Load test: CoGBK 2GB 100  byte records - single key ***'
*** Load test: CoGBK 2GB 100  byte records - single key ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest -Prunner=:runners:google-cloud-dataflow-java '-PloadTest.args=--project=apache-beam-testing --region=us-central1 --appName=load_tests_Java11_Dataflow_V2_streaming_CoGBK_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --influxMeasurement=java_streaming_cogbk_1 --influxTags={"runnerVersion":"v2","jdk":"java11"} --publishToInfluxDB=true --sourceOptions={"numRecords":20000000,"keySizeBytes":10,"valueSizeBytes":90,"numHotKeys":1} --coSourceOptions={"numRecords":2000000,"keySizeBytes":10,"valueSizeBytes":90,"numHotKeys":1000} --iterations=1 --numWorkers=5 --autoscalingAlgorithm=NONE --streaming=true --influxDatabase=beam_test_metrics --influxHost=http://10.128.0.96:8086 --inputWindowDurationSec=1200 --coInputWindowDurationSec=1200 --runner=DataflowRunner' -Prunner.version=V2 -PcompileAndRunTestsWithJava11 -Pjava11Home=/usr/lib/jvm/java-11-openjdk-amd64 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:java:testing:load-tests:run
Starting a Gradle Daemon, 3 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

> Configure project :sdks:go:test
System Go installation: /snap/bin/go is go version go1.16.13 linux/amd64; Preparing to use /home/jenkins/go/bin/go1.16.12
go1.16.12: already downloaded in /home/jenkins/sdk/go1.16.12
GOCMD=/home/jenkins/go/bin/go1.16.12

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/testing/load-tests/build.gradle'> line: 113

* What went wrong:
A problem occurred evaluating project ':sdks:java:testing:load-tests'.
> Could not get unknown property 'dockerImageName' for project ':runners:google-cloud-dataflow-java' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 13s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/44darudcfdm3c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #226

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/226/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-13751] Don't block on gcloud when attempting to get default GCP

[Kyle Weaver] [BEAM-13751] Parameterize wait timeout so test doesn't waste 2s.

[Kyle Weaver] [BEAM-13751] Add comment explaining sleep.

[noreply] Update Python SDK beam-master tags (#16630)

[noreply] Merge pull request #16592 from [BEAM-13722][Playground] Add precompiling

[noreply] Merge pull request #16505 from [BEAM-13527] [Playground] Pipeline

[noreply] [BEAM-13293] XLang Jdbc IO for Go SDK (#16111)

[noreply] [BEAM-10206] Add Go Vet to Github Actions (#16612)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6c9c208197d3d74b1c3643d22716ad3b00213506 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6c9c208197d3d74b1c3643d22716ad3b00213506 # timeout=10
Commit message: "[BEAM-10206] Add Go Vet to Github Actions (#16612)"
 > git rev-list --no-walk 178cb7b65401d860b63fc7415f02f1dea2c4582f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11] $ /bin/bash -xe /tmp/jenkins1855643352193347518.sh
+ echo '*** Load test: CoGBK 2GB 100  byte records - single key ***'
*** Load test: CoGBK 2GB 100  byte records - single key ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest -Prunner=:runners:google-cloud-dataflow-java '-PloadTest.args=--project=apache-beam-testing --region=us-central1 --appName=load_tests_Java11_Dataflow_V2_streaming_CoGBK_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --influxMeasurement=java_streaming_cogbk_1 --influxTags={"runnerVersion":"v2","jdk":"java11"} --publishToInfluxDB=true --sourceOptions={"numRecords":20000000,"keySizeBytes":10,"valueSizeBytes":90,"numHotKeys":1} --coSourceOptions={"numRecords":2000000,"keySizeBytes":10,"valueSizeBytes":90,"numHotKeys":1000} --iterations=1 --numWorkers=5 --autoscalingAlgorithm=NONE --streaming=true --influxDatabase=beam_test_metrics --influxHost=http://10.128.0.96:8086 --inputWindowDurationSec=1200 --coInputWindowDurationSec=1200 --runner=DataflowRunner' -Prunner.version=V2 -PcompileAndRunTestsWithJava11 -Pjava11Home=/usr/lib/jvm/java-11-openjdk-amd64 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:java:testing:load-tests:run
Starting a Gradle Daemon, 3 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

> Configure project :sdks:go:test
System Go installation: /snap/bin/go is go version go1.16.13 linux/amd64; Preparing to use /home/jenkins/go/bin/go1.16.12
go1.16.12: already downloaded in /home/jenkins/sdk/go1.16.12
GOCMD=/home/jenkins/go/bin/go1.16.12

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/testing/load-tests/build.gradle'> line: 113

* What went wrong:
A problem occurred evaluating project ':sdks:java:testing:load-tests'.
> Could not get unknown property 'dockerImageName' for project ':runners:google-cloud-dataflow-java' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 14s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/oigweayvbqq66

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #225

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/225/display/redirect?page=changes>

Changes:

[marcin.kuthan] Get rid of unnessecary logs for BigQuery streaming writes in

[dhuntsperger] added GitHub example references to Python multilang quickstart

[mmack] [adhoc] Test S3Options and AwsOptions for Sdk v2

[noreply] [BEAM-13537] Fix NPE in kafkatopubsub example (#16625)

[noreply] [BEAM-13740] update java_tests.yml to remove setup-go, which is

[Heejong Lee] Fix google3 import error

[noreply] [BEAM-12976] Implement Java projection pushdown optimizer. (#16513)

[noreply] Merge pull request #16579 from Revert "Revert "Merge pull request #15863

[noreply] Merge pull request #16606 from [BEAM-13247] [Playground] Embedding


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-15 (beam) in workspace <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.7.4'
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 178cb7b65401d860b63fc7415f02f1dea2c4582f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 178cb7b65401d860b63fc7415f02f1dea2c4582f # timeout=10
Commit message: "Merge pull request #16606 from [BEAM-13247] [Playground] Embedding iframe"
 > git rev-list --no-walk f687ece82d3623062fb1ab0f7b3e1366638ad867 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11] $ /bin/bash -xe /tmp/jenkins321842524113068946.sh
+ echo '*** Load test: CoGBK 2GB 100  byte records - single key ***'
*** Load test: CoGBK 2GB 100  byte records - single key ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest -Prunner=:runners:google-cloud-dataflow-java '-PloadTest.args=--project=apache-beam-testing --region=us-central1 --appName=load_tests_Java11_Dataflow_V2_streaming_CoGBK_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --influxMeasurement=java_streaming_cogbk_1 --influxTags={"runnerVersion":"v2","jdk":"java11"} --publishToInfluxDB=true --sourceOptions={"numRecords":20000000,"keySizeBytes":10,"valueSizeBytes":90,"numHotKeys":1} --coSourceOptions={"numRecords":2000000,"keySizeBytes":10,"valueSizeBytes":90,"numHotKeys":1000} --iterations=1 --numWorkers=5 --autoscalingAlgorithm=NONE --streaming=true --influxDatabase=beam_test_metrics --influxHost=http://10.128.0.96:8086 --inputWindowDurationSec=1200 --coInputWindowDurationSec=1200 --runner=DataflowRunner' -Prunner.version=V2 -PcompileAndRunTestsWithJava11 -Pjava11Home=/usr/lib/jvm/java-11-openjdk-amd64 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:java:testing:load-tests:run
Starting a Gradle Daemon, 3 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

> Configure project :sdks:go:test
System Go installation: /usr/bin/go is go version go1.16.13 linux/amd64; Preparing to use /home/jenkins/go/bin/go1.16.12
go1.16.12: already downloaded in /home/jenkins/sdk/go1.16.12
GOCMD=/home/jenkins/go/bin/go1.16.12

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/testing/load-tests/build.gradle'> line: 113

* What went wrong:
A problem occurred evaluating project ':sdks:java:testing:load-tests'.
> Could not get unknown property 'dockerImageName' for project ':runners:google-cloud-dataflow-java' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 15s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/gaqq5xefmbvpi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #224

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/224/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-13093] Enable JavaUsingPython CrossLanguageValidateRunner test for

[mmack] [BEAM-13746] Fix deserialization of SSECustomerKey for AWS Sdk v2

[noreply] [BEAM-7928] Allow users to specify worker disk type for Dataflow runner

[noreply] Merge pull request #16534 from [BEAM-13671][Playground] Add backend

[noreply] [BEAM-13271] Bump errorprone to 2.10.0 (#16231)

[noreply] [BEAM-13595] Don't load main session when cloudpickle is used. (#16589)

[Heejong Lee] Update readme for XVR tests


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-5 (beam) in workspace <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f687ece82d3623062fb1ab0f7b3e1366638ad867 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f687ece82d3623062fb1ab0f7b3e1366638ad867 # timeout=10
Commit message: "Merge pull request #16626 from ihji/update_readme"
 > git rev-list --no-walk 41d585f82b10195f758d14e3a54076ea1f05aa75 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11] $ /bin/bash -xe /tmp/jenkins3078908429813086756.sh
+ echo '*** Load test: CoGBK 2GB 100  byte records - single key ***'
*** Load test: CoGBK 2GB 100  byte records - single key ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest -Prunner=:runners:google-cloud-dataflow-java '-PloadTest.args=--project=apache-beam-testing --region=us-central1 --appName=load_tests_Java11_Dataflow_V2_streaming_CoGBK_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --influxMeasurement=java_streaming_cogbk_1 --influxTags={"runnerVersion":"v2","jdk":"java11"} --publishToInfluxDB=true --sourceOptions={"numRecords":20000000,"keySizeBytes":10,"valueSizeBytes":90,"numHotKeys":1} --coSourceOptions={"numRecords":2000000,"keySizeBytes":10,"valueSizeBytes":90,"numHotKeys":1000} --iterations=1 --numWorkers=5 --autoscalingAlgorithm=NONE --streaming=true --influxDatabase=beam_test_metrics --influxHost=http://10.128.0.96:8086 --inputWindowDurationSec=1200 --coInputWindowDurationSec=1200 --runner=DataflowRunner' -Prunner.version=V2 -PcompileAndRunTestsWithJava11 -Pjava11Home=/usr/lib/jvm/java-11-openjdk-amd64 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:java:testing:load-tests:run
Starting a Gradle Daemon, 3 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

> Configure project :sdks:go:test
System Go installation: /snap/bin/go is go version go1.16.13 linux/amd64; Preparing to use /home/jenkins/go/bin/go1.16.12
go1.16.12: already downloaded in /home/jenkins/sdk/go1.16.12
GOCMD=/home/jenkins/go/bin/go1.16.12

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/testing/load-tests/build.gradle'> line: 113

* What went wrong:
A problem occurred evaluating project ':sdks:java:testing:load-tests'.
> Could not get unknown property 'dockerImageName' for project ':runners:google-cloud-dataflow-java' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 15s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/vb2eyhfand7pm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #223

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/223/display/redirect?page=changes>

Changes:

[artur.khanin] Privacy policy update regarding Apache Beam Playground

[Daniel Oliveira] [BEAM-13321] Fix exception with BigQuery StreamWriter TraceID.

[mmack] [BEAM-8807] Add integration test for SnsIO.write (Sdk v1 & v2)

[noreply] [BEAM-13736] Make lifting cache exact. (#16603)

[noreply] Merge pull request #16565 from [BEAM-13692][Playground]  Implement

[noreply] Merge pull request #16502 from [BEAM-13650][Playground] Add link for

[noreply] [BEAM-13310] remove call to get offset consumer config, which was rep…


------------------------------------------
[...truncated 51.41 KB...]
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 26, 2022 12:46:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 204 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 26, 2022 12:46:11 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 204 files cached, 0 files newly uploaded in 0 seconds
Jan 26, 2022 12:46:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 26, 2022 12:46:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <114075 bytes, hash 2a52c358ab6afb015385470d26cfb618c296309cb5c727e96afee5f468d69acb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-KlLDWKtq-wFThUcNJs-2GMKWMJy1xyfpav7l9GjWmss.pb
Jan 26, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 26, 2022 12:46:13 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1947596f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3078cac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f6de586, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f2bd6d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c847072, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43d9f1a2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@23f86d8a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2264ea32, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d3c09ec, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71e4b308, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11900483, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@14a049f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@94e51e8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5de6cf3a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cc36c19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a3a1bf9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2100d047, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4af45442, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d805aa1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@176f7f3b]
Jan 26, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 26, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 26, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 26, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 26, 2022 12:46:13 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34b9eb03]
Jan 26, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 26, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 26, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 26, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 26, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 26, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 26, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 26, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 26, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 26, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 26, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 26, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 26, 2022 12:46:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-26_04_46_13-13297979137814239911?project=apache-beam-testing
Jan 26, 2022 12:46:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-26_04_46_13-13297979137814239911
Jan 26, 2022 12:46:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-26_04_46_13-13297979137814239911
Jan 26, 2022 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-26T12:46:20.704Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-c0rs. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 26, 2022 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:25.011Z: Worker configuration: e2-standard-2 in us-central1-b.
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:25.443Z: Expanding SplittableParDo operations into optimizable parts.
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:25.494Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:25.562Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:25.647Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:25.683Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:25.746Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:25.857Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:25.880Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:25.911Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:25.945Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:25.977Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.024Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.052Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.077Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.111Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.143Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.166Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.198Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.235Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.267Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.300Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.334Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.368Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.394Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.419Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.450Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.497Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.520Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.553Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 26, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:26.914Z: Starting 5 workers in us-central1-b...
Jan 26, 2022 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:46:55.412Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 26, 2022 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:47:15.413Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 26, 2022 12:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:48:13.892Z: Workers have started successfully.
Jan 26, 2022 12:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T12:48:13.920Z: Workers have started successfully.
Jan 26, 2022 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T16:00:34.523Z: Cancel request is committed for workflow job: 2022-01-26_04_46_13-13297979137814239911.
Jan 26, 2022 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T16:00:34.606Z: Cleaning up.
Jan 26, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T16:00:34.677Z: Stopping worker pool...
Jan 26, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T16:00:34.732Z: Stopping worker pool...
Jan 26, 2022 4:03:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T16:02:59.898Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 26, 2022 4:03:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-26T16:02:59.936Z: Worker pool stopped.
Jan 26, 2022 4:03:05 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-26_04_46_13-13297979137814239911 finished with status CANCELLED.
Load test results for test (ID): 6278227b-1960-46af-85ae-97f03b8b4c29 and timestamp: 2022-01-26T12:46:07.536000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11427.155
dataflow_v2_java11_total_bytes_count             2.08754957E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220126124335
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1eac405e6df8d224c616f83211c43eaeed5b86646f4dd0756ea0b82b8e196b3e
Deleted: sha256:9eb22c9566ce29161db940201866b381d4cfedcfadf02e7fb19608974f358eb7
Deleted: sha256:ab0e4583e6362d9fe7fb14f0df2436a45c703cbcda7231018cfe4e220ade0408
Deleted: sha256:12a9587fd3e513dc7e7a3fe943f72cd5df0dfc8ec8bdafcf6b1c199a6d54ac6a
Deleted: sha256:d6b51961f5cc54f863c8ec9f767081d334a09d4d61ab010f1c7816009e67f9e1
Deleted: sha256:700e4b274581b80b68722a5793a0179911697a48ddcee8099aa3a4f5bfd9b038
Deleted: sha256:1d9167783adb49dc0dcd57734f8fe0e08a324e7c7328c17e19a5ab3d93b7d77a
Deleted: sha256:2abd4cdffdfa74df65ea4a49f892326a65646b73484c5a39c48fa8b7911f16a1
Deleted: sha256:81140e570eb0faa7972b47c473cfcd46bccfb62c0669ebf1ec5e56e10c173768
Deleted: sha256:954f5e699208af0d61ef61f0ba5085ae1b7d6505b43d7d925dfadeb65d81e0b6
Deleted: sha256:fbd0feea80516223cfbff3e5e805f96236b1d9e14cc96ca7576d7ed388fa6ca2
Deleted: sha256:582e09d9f53dde9911134431821e898c60b972ec3c0b26663a3119885aed8489
Deleted: sha256:58084b5bb8e5382b7061da7146330e598313fb3131c9e87c20bc990e3c4f082e
Deleted: sha256:b989466baf5b1445f3d90b1f3b4d88cf1423860e106e5e85decec9428ced1b87
Deleted: sha256:203e4d209b4ec63c167cb8c5749d977f3a6c71dfb84b21fd6f2593d788ca3816
Deleted: sha256:f6f10a620a64d30963cbaf4888574ff79cce7e69d75f1a71bf9f00730acddd47
Deleted: sha256:ffd55aa6f7e45b8f943095b3e745e03cd4b6777f65f7ed49c9bf28eaa870af2f
Deleted: sha256:8af93658b6de747b2fdf2e3b956d3da5a00a833aefcfee84fce816aa8ba5d302
Deleted: sha256:0dade9d6e684b8b3191fd8209e6061b0f9a62800934311e3692d018e594a2883
Deleted: sha256:76f9f9d0271b9514526a70aaa3025c10ee9c5b6a5d856019b670cc9b587fab3f
Deleted: sha256:eb9b1d2640653af3426feae0bda64acb05e15ea318bc52c5ef724c3821908591
Deleted: sha256:e1bfbf9615f64429e0a507424cce9d14e45de72faa231a0b4917fd8f89bb2a1a
Deleted: sha256:5881013990107ab03b4b0203e40f0263d1ae639a1d5f68dd611cffc3a0a7e48d
Deleted: sha256:9bd84a28ab6f85af099c5781b18ead896dd53cb11e2c16bdc7aa95b427f2c0fe
Deleted: sha256:70c0ce5a7d406fda4fe5829e6429f9b57722fbe0afa0a5bf6eda90db1fd76f2b
Deleted: sha256:818617bb409153ff930968176f51ad6091a33fa52070657a1feb976662ae54f0
Deleted: sha256:d79a9d294b47df08825061ad0c89c65543f7f9f97bd4465f3aed01e96aededea
Deleted: sha256:df2fd65bfcb4307f90bb9fdf9613c3f24f5ea480c9b96558e46d7398c1997066
Deleted: sha256:2bf13fa4ccd639fa8d5e4ef69084a865d8431729baa6deceaddfbbf93be05a88
Deleted: sha256:2cbf787ec6a5c43d9fc3339986daf9f0aa8194268c5a4f3fbfee71f6fc82abba
Deleted: sha256:5c2fae66fbfca2ede4897e8ba8ca5fa96137e0e187d56a7937a04229d3d09582
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220126124335]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1eac405e6df8d224c616f83211c43eaeed5b86646f4dd0756ea0b82b8e196b3e]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220126124335] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1eac405e6df8d224c616f83211c43eaeed5b86646f4dd0756ea0b82b8e196b3e])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1eac405e6df8d224c616f83211c43eaeed5b86646f4dd0756ea0b82b8e196b3e
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1eac405e6df8d224c616f83211c43eaeed5b86646f4dd0756ea0b82b8e196b3e
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1eac405e6df8d224c616f83211c43eaeed5b86646f4dd0756ea0b82b8e196b3e].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c07e8f47c4537b9f91d3814dfe4b4fbed465cd40fe06f32257aed502f7fb4bb6
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c07e8f47c4537b9f91d3814dfe4b4fbed465cd40fe06f32257aed502f7fb4bb6
ERROR: (gcloud.container.images.delete) Not found: response: {'status': '404', 'content-length': '168', 'x-xss-protection': '0', 'transfer-encoding': 'chunked', 'server': 'Docker Registry', '-content-encoding': 'gzip', 'docker-distribution-api-version': 'registry/2.0', 'cache-control': 'private', 'date': 'Wed, 26 Jan 2022 16:03:10 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
Failed to compute blob liveness for manifest: 'sha256:c07e8f47c4537b9f91d3814dfe4b4fbed465cd40fe06f32257aed502f7fb4bb6': None

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 290

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 49s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/5umerf77ijq56

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #222

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/222/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-13716] Clear before creating a new virtual environment in

[mmack] [BEAM-13653] Make SnsIO.write topicArn optional. If provided, validate

[noreply] [BEAM-10897] Update the fastavro lower bound due to an issue on Windows

[noreply] [BEAM-13605] Update pandas_doctests_test denylists in preparation for

[noreply] Merge pull request #16538 from [BEAM-13676][Playground][Bugfix]Build Of

[noreply] Merge pull request #16582 from [BEAM-13711] [Playground] [Bugfix] Add

[noreply] Merge pull request #16515 from [BEAM-13636] [Playground] Checking the

[ningkang0957] [BEAM-13275] Removed the explicit selenium dependency from setup

[noreply] [BEAM-10206] Deprecate unused shallow cloning functions (#16600)

[noreply] Bump Dataflow container versions (#16602)

[noreply] Improved multi-language pipelines section of the programming guide

[mmack] [BEAM-13510] Don't retry on invalid SQS receipt handles.


------------------------------------------
[...truncated 53.48 KB...]
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f6de586, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f2bd6d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c847072, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43d9f1a2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@23f86d8a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2264ea32, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d3c09ec, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71e4b308, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11900483, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@14a049f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@94e51e8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5de6cf3a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cc36c19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a3a1bf9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2100d047, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4af45442, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d805aa1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@176f7f3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@30ca0779, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58740366]
Jan 25, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 25, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 25, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 25, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 25, 2022 12:46:13 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34b9eb03, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43fda8d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49d831c2]
Jan 25, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 25, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 25, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 25, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 25, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 25, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 25, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 25, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 25, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 25, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 25, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 25, 2022 12:46:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 25, 2022 12:46:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-25_04_46_13-6663904842635572357?project=apache-beam-testing
Jan 25, 2022 12:46:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-25_04_46_13-6663904842635572357
Jan 25, 2022 12:46:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-25_04_46_13-6663904842635572357
Jan 25, 2022 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-25T12:46:21.757Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-qyct. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 25, 2022 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:25.608Z: Worker configuration: e2-standard-2 in us-central1-b.
Jan 25, 2022 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:26.253Z: Expanding SplittableParDo operations into optimizable parts.
Jan 25, 2022 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:26.288Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 25, 2022 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:26.350Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 25, 2022 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:26.406Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 25, 2022 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:26.434Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 25, 2022 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:26.513Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:26.636Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:26.673Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:26.704Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:26.738Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:26.777Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:26.810Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:26.856Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:26.881Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:26.914Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:26.953Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:26.991Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:27.043Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:27.079Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:27.113Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:27.148Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:27.184Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:27.226Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:27.255Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:27.292Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:27.318Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:27.348Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:27.382Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:27.465Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 25, 2022 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:27.811Z: Starting 5 workers in us-central1-b...
Jan 25, 2022 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:46:36.867Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 25, 2022 12:47:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:47:12.308Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 25, 2022 12:48:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:48:13.834Z: Workers have started successfully.
Jan 25, 2022 12:48:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T12:48:13.864Z: Workers have started successfully.
Jan 25, 2022 12:57:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-25T12:57:12.740Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
Jan 25, 2022 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T16:00:26.302Z: Cancel request is committed for workflow job: 2022-01-25_04_46_13-6663904842635572357.
Jan 25, 2022 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T16:00:26.511Z: Cleaning up.
Jan 25, 2022 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T16:00:26.581Z: Stopping worker pool...
Jan 25, 2022 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T16:00:26.633Z: Stopping worker pool...
Jan 25, 2022 4:02:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T16:02:41.654Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 25, 2022 4:02:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-25T16:02:41.686Z: Worker pool stopped.
Jan 25, 2022 4:02:47 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-25_04_46_13-6663904842635572357 finished with status CANCELLED.
Load test results for test (ID): 92761e44-82f7-4e56-8992-aea01f9be228 and timestamp: 2022-01-25T12:46:07.711000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11446.062
dataflow_v2_java11_total_bytes_count             1.68872296E10
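As a quick sanity check on the two metrics reported above, the implied average throughput before cancellation can be derived with a few lines of Python (the variable names are illustrative only, not part of the load-test harness):

```python
# Metrics reported by the load test above.
runtime_sec = 11446.062        # dataflow_v2_java11_runtime_sec
total_bytes = 1.68872296e10    # dataflow_v2_java11_total_bytes_count

# Implied average end-to-end throughput of the streaming CoGBK pipeline.
bytes_per_sec = total_bytes / runtime_sec
print(f"{bytes_per_sec / 1e6:.2f} MB/s")
```

This works out to roughly 1.5 MB/s averaged over the ~3.2 hour run, which is useful context when comparing against earlier builds of this load test.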
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220125124338
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a4047dda4444d67d6a79d9e4bba2dc5e29b61bfef6cfed4ca686031033c6c190
Deleted: sha256:9babd310ad459a08379a87217c495ab375b22d0707ff1ebe868036b9c3292b05
Deleted: sha256:fdcdbd43456f38101571bb73e7bfcaee4b829b9e2a68f2532393f595b6fe6b79
Deleted: sha256:4424635fdc69d5ae2fb391b25a1c1e423160dd42a25cfb1ba187825910bff23b
Deleted: sha256:3d4686a86ced7cfaaad797fcaba0f3ad6a8901c167e7ca4a48377910957c133d
Deleted: sha256:3d21f2ee0e4aa3d5f0b74bf962f4e858d405837fab8e1b6d1b26715b468930c3
Deleted: sha256:525e159289e730cd9adbd2f2157de76a010ef0c43df60ba954ffd55ae3437bc0
Deleted: sha256:69d76b712a50feb163b9d8df9aeff5ab8103821ae7f7810a47305b606d13eb60
Deleted: sha256:ac033b1867d44807d556004484da2037edebcfade0bbb88c820211f8039d5019
Deleted: sha256:6508accac77e7a1c89ec7bd6bf55fb27a6cea46802da8617cacbb9766118c1da
Deleted: sha256:c735ff24a9b7c07eb90da151024c50349187dd5d2612ccf568f8957b30c8e613
Deleted: sha256:5fdf57413c5acf1b9d9040629c40f8a6b642929762d46015edfc6809fc907875
Deleted: sha256:7ec8d1d04af15dd280804054233d9f7259975a79610c9b8eb443c65f91a1c6ba
Deleted: sha256:66ab16f9abd991c18e7a0fa1cddde8e7dbe22cfb12abd7c79cf747895d62216a
Deleted: sha256:acff4ebd6afa616703d982051100bb81097e3d3cb0e1161232c90e3797132d00
Deleted: sha256:82312c4331d2f775f9cee595ec24855ba7d4c03b1e495cd4db950146e9255cb3
Deleted: sha256:f0635a4b22b34abcc55ccb1a0d36a90bc09bb1ac4fe4eb794786dc364a5b1a8e
Deleted: sha256:517d460cfd6f455ef29e5aa1275861751ebf43eaf3f20de9600310f959a2b1b1
Deleted: sha256:cde7f1dc4c3d74430a6722aaf1b809fc9982fc950097e88fc0f2fc6342c98c49
Deleted: sha256:560cf8e48e29871552f76cc0186865d0e572b0a66260d1894dab6fcfeeab8347
Deleted: sha256:6c299191677a6afff0f99936e7c00d29680f0d277b3aa9eb925b9ed013df1de4
Deleted: sha256:b850cae7484e4c0526121c5b249c193472178d2c3c9c114eb260e9f948f4efc3
Deleted: sha256:522c983e836c163aa2fb8248b14702e35030ef3da0088fc41104440ef5f6f8ac
Deleted: sha256:d121b241bd4c7470989736c007f1318847c02f920835aa78f5fdef93a177ebc3
Deleted: sha256:076a1057f8ceb9a28bcd361fc0c15b19b322b8c973ac9798896105cb3356755f
Deleted: sha256:18d2f345ead2da5d9ecc0effd0c73ad5b3009f3d2fd853989f4223cc9f4c1845
Deleted: sha256:f4949d04a5d638756cc60880f8a2b9f0a993138bc536d5b74fb1d992cc80569c
Deleted: sha256:2210480836fbe9670830e279fe73cd4387e21ce3862d7a17469d81729bacb8ca
Deleted: sha256:5a03267d766826dbb5a4ac2898563a949356b4ba6e421967ad8f2c92dd4b833a
Deleted: sha256:058367e555d4e5c16f4d3ddb7a5fd117744ada6d5f937d18b07ec4a661590351
Deleted: sha256:96828d0f54da6dbacde273408bede56721aee594b5234a4d659b1bf3f094e257
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220125124338]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a4047dda4444d67d6a79d9e4bba2dc5e29b61bfef6cfed4ca686031033c6c190]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220125124338] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a4047dda4444d67d6a79d9e4bba2dc5e29b61bfef6cfed4ca686031033c6c190])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a4047dda4444d67d6a79d9e4bba2dc5e29b61bfef6cfed4ca686031033c6c190
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a4047dda4444d67d6a79d9e4bba2dc5e29b61bfef6cfed4ca686031033c6c190
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a4047dda4444d67d6a79d9e4bba2dc5e29b61bfef6cfed4ca686031033c6c190].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 31s
109 actionable tasks: 74 executed, 31 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/qkqi7i2oavhqs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #221

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/221/display/redirect>

Changes:


------------------------------------------
[...truncated 46.88 KB...]
golang.org/x/text,https://go.googlesource.com/text/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/apache/beam/sdks/java/container,https://github.com/apache/beam/blob/master/LICENSE,Apache-2.0
github.com/apache/beam/sdks/go/pkg/beam,https://github.com/apache/beam/blob/master/sdks/go/README.md,Apache-2.0
golang.org/x/net,https://go.googlesource.com/net/+/refs/heads/master/LICENSE,BSD-3-Clause
+ chmod -R a+w /output/licenses

> Task :release:go-licenses:java:createLicenses
> Task :sdks:java:container:java11:copyGolangLicenses
> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:container:java11:docker

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.

See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
08881f8e5634: Preparing
3295642dd35d: Preparing
7f300e65609e: Preparing
33a2eb14bd4a: Preparing
3afd30fba654: Preparing
d2f1cdf0d861: Preparing
cd71ad19f1c7: Preparing
d5fb0eb74494: Preparing
4e28ff0d6619: Preparing
488f8b6417f7: Preparing
10732ed4a521: Preparing
8d4146ffe202: Preparing
17482a157a9b: Preparing
3bb5258f46d2: Preparing
832e177bb500: Preparing
f9e18e59a565: Preparing
d5fb0eb74494: Waiting
cd71ad19f1c7: Waiting
26a504e63be4: Preparing
8bf42db0de72: Preparing
4e28ff0d6619: Waiting
488f8b6417f7: Waiting
10732ed4a521: Waiting
31892cc314cb: Preparing
11936051f93b: Preparing
17482a157a9b: Waiting
3bb5258f46d2: Waiting
26a504e63be4: Waiting
832e177bb500: Waiting
f9e18e59a565: Waiting
8d4146ffe202: Waiting
11936051f93b: Waiting
8bf42db0de72: Waiting
3afd30fba654: Pushed
3295642dd35d: Pushed
7f300e65609e: Pushed
33a2eb14bd4a: Pushed
d2f1cdf0d861: Pushed
08881f8e5634: Pushed
d5fb0eb74494: Pushed
488f8b6417f7: Pushed
cd71ad19f1c7: Pushed
3bb5258f46d2: Layer already exists
8d4146ffe202: Pushed
f9e18e59a565: Layer already exists
832e177bb500: Layer already exists
26a504e63be4: Layer already exists
4e28ff0d6619: Pushed
8bf42db0de72: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
17482a157a9b: Pushed
10732ed4a521: Pushed
20220124124331: digest: sha256:389ec7cdac55c8dcc0448eea26d529f42361cd80a48e2ecfd4a8ad52b23834df size: 4520

> Task :sdks:java:testing:load-tests:run
Jan 24, 2022 12:45:29 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 24, 2022 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 204 files. Enable logging at DEBUG level to see which files will be staged.
Jan 24, 2022 12:45:30 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 24, 2022 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 24, 2022 12:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 204 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 24, 2022 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 204 files cached, 0 files newly uploaded in 0 seconds
Jan 24, 2022 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 24, 2022 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <114075 bytes, hash de8261c71ef455048cd0e681b1f83d1ee90c895c5bafd4e9ec89f89a0714352b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-3oJhxx70VQSM0OaBsfg9HukMiVxbr9Tp7In4mgcUNSs.pb
Jan 24, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 24, 2022 12:45:35 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44b29496, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1947596f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3078cac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f6de586, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f2bd6d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c847072, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43d9f1a2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@23f86d8a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2264ea32, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d3c09ec, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71e4b308, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11900483, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@14a049f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@94e51e8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5de6cf3a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cc36c19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a3a1bf9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2100d047, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4af45442, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d805aa1]
Jan 24, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 24, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 24, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 24, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 24, 2022 12:45:35 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e]
Jan 24, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 24, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 24, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 24, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 24, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 24, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 24, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 24, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 24, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 24, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 24, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 24, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 24, 2022 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-24_04_45_35-445355469944687180?project=apache-beam-testing
Jan 24, 2022 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-24_04_45_35-445355469944687180
Jan 24, 2022 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-24_04_45_35-445355469944687180
Jan 24, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-24T12:45:43.051Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-mm39. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 24, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:48.178Z: Worker configuration: e2-standard-2 in us-central1-b.
Jan 24, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.048Z: Expanding SplittableParDo operations into optimizable parts.
Jan 24, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.110Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 24, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.175Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 24, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.246Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 24, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.289Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 24, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.345Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 24, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.452Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 24, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.471Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 24, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.498Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 24, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.522Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 24, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.550Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 24, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.588Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 24, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.610Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 24, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.637Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 24, 2022 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.671Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 24, 2022 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.694Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 24, 2022 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.717Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 24, 2022 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.748Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 24, 2022 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.781Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 24, 2022 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.814Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 24, 2022 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.843Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 24, 2022 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.873Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 24, 2022 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.907Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 24, 2022 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.928Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 24, 2022 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.961Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 24, 2022 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:49.992Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 24, 2022 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:50.026Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 24, 2022 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:50.050Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 24, 2022 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:50.071Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
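The "Fusing consumer X into Y" messages above record Dataflow's fusion optimization: adjacent per-element stages are collapsed into one stage so elements flow between them in memory instead of being materialized. A minimal illustrative sketch of that idea (the stage names below are hypothetical; this is not Dataflow's actual optimizer):

```python
# Illustrative sketch of producer/consumer fusion: two per-element
# stages are composed into a single stage, so the intermediate value
# is never materialized between them. Names are hypothetical.

def fuse(producer, consumer):
    """Return one stage equivalent to running producer, then consumer."""
    def fused(element):
        return consumer(producer(element))
    return fused

# Two toy stages mirroring "Collect start time metrics" -> "Window.Assign":
annotate = lambda x: (x, "t0")            # producer stage
window = lambda pair: (pair, "window1")   # consumer stage fused into it

fused_stage = fuse(annotate, window)
```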
Jan 24, 2022 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:45:50.417Z: Starting 5 workers in us-central1-b...
Jan 24, 2022 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:46:10.082Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
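The warning above means the project has hit the 100-descriptor limit for custom.googleapis.com/* metrics, and suggests deleting old descriptors via the Monitoring metricDescriptors.list/delete APIs it links. A hedged sketch of just the selection step (the descriptor names are hypothetical, and this does not call the Monitoring API):

```python
# Select metric descriptors that are deletion candidates: only names
# under custom.googleapis.com/ count against the limit in the warning.

def custom_descriptors(descriptor_names):
    """Return the subset of names under custom.googleapis.com/."""
    return [n for n in descriptor_names
            if n.startswith("custom.googleapis.com/")]

names = [
    "custom.googleapis.com/old_job_counter",     # hypothetical custom metric
    "dataflow.googleapis.com/job/user_counter",  # built-in, always available
]
stale = custom_descriptors(names)
```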
Jan 24, 2022 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:46:35.661Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 24, 2022 12:47:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:47:34.434Z: Workers have started successfully.
Jan 24, 2022 12:47:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-24T12:47:34.468Z: Workers have started successfully.
Jan 24, 2022 2:56:19 PM org.apache.beam.sdk.metrics.MetricsEnvironment getCurrentContainer
WARNING: Reporting metrics are not supported in the current execution environment.
Build timed out (after 240 minutes). Marking the build as aborted.
FATAL: command execution failed
hudson.remoting.ChannelClosedException: Channel "hudson.remoting.Channel@3caf42b8:apache-beam-jenkins-7": Remote call on apache-beam-jenkins-7 failed. The channel is closing down or has closed down
	at hudson.remoting.Channel.call(Channel.java:994)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:286)
	at com.sun.proxy.$Proxy123.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1211)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1203)
	at hudson.Launcher$ProcStarter.join(Launcher.java:523)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:803)
	at hudson.model.Build$BuildExecution.build(Build.java:197)
	at hudson.model.Build$BuildExecution.doRun(Build.java:163)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:513)
	at hudson.model.Run.execute(Run.java:1906)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:429)
Caused by: java.io.IOException
	at hudson.remoting.Channel.close(Channel.java:1499)
	at hudson.remoting.Channel.close(Channel.java:1455)
	at hudson.slaves.SlaveComputer.closeChannel(SlaveComputer.java:884)
	at hudson.slaves.SlaveComputer.access$100(SlaveComputer.java:110)
	at hudson.slaves.SlaveComputer$2.run(SlaveComputer.java:765)
	at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
	at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:68)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-7 is offline; cannot locate jdk_1.8_latest

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #220

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/220/display/redirect>

Changes:


------------------------------------------
[...truncated 49.71 KB...]
92bdfc681768: Preparing
31db193b5341: Preparing
2b516344936b: Preparing
a382a235a51a: Preparing
3bb5258f46d2: Preparing
832e177bb500: Preparing
f9e18e59a565: Preparing
26a504e63be4: Preparing
8bf42db0de72: Preparing
31892cc314cb: Preparing
11936051f93b: Preparing
3bb5258f46d2: Waiting
a382a235a51a: Waiting
832e177bb500: Waiting
92bdfc681768: Waiting
f9e18e59a565: Waiting
31db193b5341: Waiting
8b832f1350cf: Waiting
26a504e63be4: Waiting
cc7c978d3013: Waiting
2b516344936b: Waiting
cf73a7ff016b: Waiting
8bf42db0de72: Waiting
f474f14b1fc4: Waiting
31892cc314cb: Waiting
dd53e43265f7: Pushed
22c5323e4404: Pushed
d0b4d25fddcf: Pushed
d7c9d110ad4a: Pushed
cc7c978d3013: Pushed
8b832f1350cf: Pushed
df9fa634f6bf: Pushed
f474f14b1fc4: Pushed
92bdfc681768: Pushed
cf73a7ff016b: Pushed
3bb5258f46d2: Layer already exists
832e177bb500: Layer already exists
f9e18e59a565: Layer already exists
26a504e63be4: Layer already exists
8bf42db0de72: Layer already exists
11936051f93b: Layer already exists
31892cc314cb: Layer already exists
2b516344936b: Pushed
a382a235a51a: Pushed
31db193b5341: Pushed
20220123124334: digest: sha256:327a5207d0536275f7c50fef32e64cdb7f68d18058aeaaa2fd8607d13c7e2f90 size: 4520

> Task :sdks:java:testing:load-tests:run
Jan 23, 2022 12:45:31 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 23, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 204 files. Enable logging at DEBUG level to see which files will be staged.
Jan 23, 2022 12:45:32 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 23, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 23, 2022 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 204 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 23, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 204 files cached, 0 files newly uploaded in 0 seconds
Jan 23, 2022 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 23, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <114075 bytes, hash 2bcdc864d0594827cda6dea9091a81e96e9e861225765bb2408e31a0528aa236> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-K83IZNBZSCfNpt6pCRqB6W6ehhIldluyQI4xoFKKojY.pb
Jan 23, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 23, 2022 12:45:37 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44b29496, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1947596f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3078cac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f6de586, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f2bd6d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c847072, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43d9f1a2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@23f86d8a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2264ea32, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d3c09ec, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71e4b308, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11900483, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@14a049f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@94e51e8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5de6cf3a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cc36c19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a3a1bf9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2100d047, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4af45442, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d805aa1]
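The "Split into 20 bundles" message above reflects the synthetic unbounded source dividing its configured record count across the requested number of initial splits. A rough sketch of that even-split arithmetic (an assumption about the behavior for illustration, not SyntheticUnboundedSource's actual code):

```python
# Evenly divide `total` records across `n` bundles; the first
# (total % n) bundles get one extra record so the sizes sum to total.

def split_records(total, n):
    base, extra = divmod(total, n)
    return [base + 1 if i < extra else base for i in range(n)]

sizes = split_records(1000, 20)  # 20 bundles of 50 records each
```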
Jan 23, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 23, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 23, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 23, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 23, 2022 12:45:37 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e]
Jan 23, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 23, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 23, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 23, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 23, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 23, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 23, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 23, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 23, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 23, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 23, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 23, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 23, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-23_04_45_38-9689623812230593792?project=apache-beam-testing
Jan 23, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-23_04_45_38-9689623812230593792
Jan 23, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-23_04_45_38-9689623812230593792
Jan 23, 2022 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-23T12:45:47.726Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-idef. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 23, 2022 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:51.411Z: Worker configuration: e2-standard-2 in us-central1-b.
Jan 23, 2022 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.008Z: Expanding SplittableParDo operations into optimizable parts.
Jan 23, 2022 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.034Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 23, 2022 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.106Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.153Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.173Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.229Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.317Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.340Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.361Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.386Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.414Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.475Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.508Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.542Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.571Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.591Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.624Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.657Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.690Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.728Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.758Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.786Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.816Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.842Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.899Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.923Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.949Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:52.970Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:53.002Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 23, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:45:53.345Z: Starting 5 workers in us-central1-b...
Jan 23, 2022 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:46:18.525Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 23, 2022 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:46:38.474Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 23, 2022 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:47:41.435Z: Workers have started successfully.
Jan 23, 2022 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T12:47:41.452Z: Workers have started successfully.
Jan 23, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T16:00:34.846Z: Cancel request is committed for workflow job: 2022-01-23_04_45_38-9689623812230593792.
Jan 23, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T16:00:34.903Z: Cleaning up.
Jan 23, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T16:00:34.964Z: Stopping worker pool...
Jan 23, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T16:00:35.010Z: Stopping worker pool...
Jan 23, 2022 4:02:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T16:02:54.678Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 23, 2022 4:02:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-23T16:02:54.726Z: Worker pool stopped.
Jan 23, 2022 4:03:01 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-23_04_45_38-9689623812230593792 finished with status CANCELLED.
Load test results for test (ID): b1a6fda8-8a23-47d9-8fd0-6353ad4ad2f7 and timestamp: 2022-01-23T12:45:32.517000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11550.603
dataflow_v2_java11_total_bytes_count             2.14111241E10
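The two metrics above imply an average throughput; a quick derivation (assuming the units are seconds and bytes, per the metric names):

```python
# Derive average throughput from the load test's reported metrics.
runtime_sec = 11550.603       # dataflow_v2_java11_runtime_sec
total_bytes = 2.14111241e10   # dataflow_v2_java11_total_bytes_count

bytes_per_sec = total_bytes / runtime_sec   # ~1.85e6 bytes/s
mib_per_sec = bytes_per_sec / (1024 ** 2)   # ~1.77 MiB/s
```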
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
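The RuntimeException above is the load-test harness treating a terminal job state other than DONE as a failure (here, CANCELLED after the 240-minute build timeout). A simplified sketch of that check, as an illustration only and not the actual JobFailure implementation:

```python
# Treat any terminal job state other than DONE as a test failure,
# mirroring the "Invalid job state: CANCELLED." error above.
# The state set here is an assumption for illustration.

TERMINAL_STATES = {"DONE", "FAILED", "CANCELLED", "UPDATED"}

def handle_failure(state):
    """Raise if `state` is terminal but not a successful completion."""
    if state in TERMINAL_STATES and state != "DONE":
        raise RuntimeError(f"Invalid job state: {state}.")
```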

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220123124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:327a5207d0536275f7c50fef32e64cdb7f68d18058aeaaa2fd8607d13c7e2f90
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220123124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:327a5207d0536275f7c50fef32e64cdb7f68d18058aeaaa2fd8607d13c7e2f90]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220123124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:327a5207d0536275f7c50fef32e64cdb7f68d18058aeaaa2fd8607d13c7e2f90])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:327a5207d0536275f7c50fef32e64cdb7f68d18058aeaaa2fd8607d13c7e2f90
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:327a5207d0536275f7c50fef32e64cdb7f68d18058aeaaa2fd8607d13c7e2f90
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:327a5207d0536275f7c50fef32e64cdb7f68d18058aeaaa2fd8607d13c7e2f90].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 45s
109 actionable tasks: 73 executed, 32 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/iuwmrkvptyk4s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #219

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/219/display/redirect?page=changes>

Changes:

[ningkang0957] [BEAM-13687] Improved Spanner IO request count metrics

[noreply] [BEAM-10206] Add key for fields in wrapper (#16583)

[noreply] Merge pull request #16530 from Adding JSON support in SpannerIO and

[noreply] [BEAM-13685] Enable users to specify cache directory under Interactive


------------------------------------------
[...truncated 50.04 KB...]
34f0f234ac31: Pushed
a0fb00b6ea3c: Pushed
84ca33d4319b: Pushed
fac0861a38ed: Pushed
4188e5f0ea30: Pushed
af2fd23ee6b3: Pushed
3bb5258f46d2: Layer already exists
832e177bb500: Layer already exists
26a504e63be4: Layer already exists
8bf42db0de72: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
f9e18e59a565: Layer already exists
21938ef367d1: Pushed
900600f37aef: Pushed
20220122124331: digest: sha256:dcf593d4d33dfea412d9e05606500dbea8e7b0ac43ff3325081ceb1b25b66bf1 size: 4520

> Task :sdks:java:testing:load-tests:run
Jan 22, 2022 12:45:13 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 22, 2022 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 204 files. Enable logging at DEBUG level to see which files will be staged.
Jan 22, 2022 12:45:14 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 22, 2022 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 22, 2022 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 204 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 22, 2022 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 204 files cached, 0 files newly uploaded in 1 seconds
Jan 22, 2022 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 22, 2022 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <114075 bytes, hash c56b06723a4b3327eb2ab11df8b2860b8f0ccbb752800051a618d40ddeb9ec2a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-xWsGcjpLMyfrKrEd-LKGC48My7dSgABRphjUDd657Co.pb
Jan 22, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 22, 2022 12:45:20 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1947596f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3078cac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f6de586, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f2bd6d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c847072, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43d9f1a2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@23f86d8a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2264ea32, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d3c09ec, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71e4b308, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11900483, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@14a049f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@94e51e8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5de6cf3a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cc36c19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a3a1bf9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2100d047, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4af45442, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d805aa1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@176f7f3b]
Jan 22, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 22, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 22, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 22, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 22, 2022 12:45:20 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34b9eb03]
Jan 22, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 22, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 22, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 22, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 22, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 22, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 22, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 22, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 22, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 22, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 22, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 22, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 22, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-22_04_45_20-518053840289819975?project=apache-beam-testing
Jan 22, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-22_04_45_20-518053840289819975
Jan 22, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-22_04_45_20-518053840289819975
Jan 22, 2022 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-22T12:45:27.627Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-jp1v. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:32.405Z: Worker configuration: e2-standard-2 in us-central1-b.
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:32.996Z: Expanding SplittableParDo operations into optimizable parts.
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:33.052Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:33.104Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:33.170Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:33.205Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:33.275Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:33.550Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:33.631Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:33.667Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:33.769Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:33.884Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:33.951Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:33.995Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:34.052Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:34.103Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:34.138Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:34.184Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:34.232Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:34.265Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:34.300Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:34.323Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:34.367Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:34.403Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:34.429Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:34.461Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:34.509Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:34.543Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:34.582Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 22, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:34.626Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 22, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:35.127Z: Starting 5 workers in us-central1-b...
Jan 22, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:45:45.489Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 22, 2022 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:46:25.811Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 22, 2022 12:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:47:24.768Z: Workers have started successfully.
Jan 22, 2022 12:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T12:47:24.806Z: Workers have started successfully.
Jan 22, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T16:00:35.369Z: Cancel request is committed for workflow job: 2022-01-22_04_45_20-518053840289819975.
Jan 22, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T16:00:39.938Z: Cleaning up.
Jan 22, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T16:00:40.015Z: Stopping worker pool...
Jan 22, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T16:00:40.080Z: Stopping worker pool...
Jan 22, 2022 4:03:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T16:03:04.360Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 22, 2022 4:03:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-22T16:03:04.407Z: Worker pool stopped.
Jan 22, 2022 4:03:10 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-22_04_45_20-518053840289819975 finished with status CANCELLED.
Load test results for test (ID): a903fa10-e52f-4b2a-87a5-e7b8c96c6fdc and timestamp: 2022-01-22T12:45:14.320000000Z:
Metric:                                Value:
dataflow_v2_java11_runtime_sec         11575.749
dataflow_v2_java11_total_bytes_count   2.51320278E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED
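
The "Invalid job state: CANCELLED" exception above is the load-test harness working as intended: the streaming job was cancelled after publishing its metrics, and JobFailure.handleFailure treats any terminal state other than DONE as a failed run. A minimal sketch of that kind of terminal-state check (illustrative names only, not Beam's actual JobFailure/LoadTest internals):

```java
import java.util.EnumSet;
import java.util.Set;

public class JobStateCheck {
    enum State { DONE, CANCELLED, FAILED, UPDATED }

    // Hypothetical: terminal states that should fail the load-test run.
    static final Set<State> INVALID_TERMINAL =
        EnumSet.of(State.CANCELLED, State.FAILED, State.UPDATED);

    // Mirrors the behavior seen in the log: throw on any non-DONE terminal state.
    static void handleFailure(State terminalState) {
        if (INVALID_TERMINAL.contains(terminalState)) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        handleFailure(State.DONE); // a DONE job passes the check silently
        try {
            handleFailure(State.CANCELLED);
            throw new AssertionError("expected a RuntimeException");
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // Invalid job state: CANCELLED.
        }
    }
}
```

This explains why the Gradle task fails with exit value 1 even though the pipeline itself ran to the cancel request: the harness deliberately converts a CANCELLED terminal state into a build failure.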

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220122124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dcf593d4d33dfea412d9e05606500dbea8e7b0ac43ff3325081ceb1b25b66bf1
Deleted: sha256:4e1c80fd78d8b09138a20abf9c5cfdf0d8e49a8eddd852b6e3d4947fd743ec27
Deleted: sha256:6d31a5e58e2cd8f7e01d667ada326056e28da3f291acd7bc09f801cbda8f6dba
Deleted: sha256:5d5030023eca247ffe6553c913858f93c9491f382be7b692813d69a271be049c
Deleted: sha256:c4f1e14f9ff3e942b92e666cb0af292de03f0f1c1bc335acd709dbad468b9782
Deleted: sha256:cae6137207efc69dacf310fb3c2a5b21ede970e90dfbd4920e20b319cf38f68c
Deleted: sha256:9efe7b45d93b361fd6b12b480cfa06ab7f0ad1057d745b5dec1ab7271950ee2d
Deleted: sha256:1cb7a1d13cc5a1c1899d0751b6c160b37b396318a5a035278ef182a4d27192f0
Deleted: sha256:33ee8d2b1e4800d6511dc4a4dceec9a026f5ca0469408ef4ae88d902575eea15
Deleted: sha256:95e61cf0da401382128ef3197cc1f3576bf984e76dcaefb705f6e336e05867e7
Deleted: sha256:81dfd5fe0dc8e23c97868816ed042feb755a3758240724c87ddf44d5b3ff7170
Deleted: sha256:ffebe5549c493a49fcc433a153e727651726a1ad86122dca8b0fe0dc52db4f7a
Deleted: sha256:3caf58e139b1f9da0fc2de3450b5900a29f147b8341bb64f5a9d6f65914ff2cf
Deleted: sha256:652e05429ef1004a8aa0a77e30dc186b42b6226085f96eea30bb33606a4b290f
Deleted: sha256:fc22a56d9cde433936324c8f0b37b01f719b2ecfc410b37635c8a499095ed782
Deleted: sha256:0b4085c97763b09a9dc1380f061f5639a4a64847721d11a61449d410c610b754
Deleted: sha256:d8fef36923094678c7d9f183b3675bdefecf793041940ea80d8f355667ec8216
Deleted: sha256:1de5033d99b370203ccf83433f88fd90e804cfc5561f7a6b1711e9f1fd7d20df
Deleted: sha256:5fa2a78b46def8e9e155a73e8b3704408919bf8ad48027830b3501a5b4692c88
Deleted: sha256:beb2526e33def90ce80897eb255ec38c9321696dc564be1042de8913da9c9950
Deleted: sha256:e236b93c2acda09795b4f7520fa8e4fdca3d1543e149883a9ac37dada4ff1815
Deleted: sha256:695665f1f7700ec705ca7c73a2205b69de6f875c6b1afc355a8529114ce12510
Deleted: sha256:ddbf565541344ac88d437c5ed292b231063c2e65774f580f8fc02dd7852435a8
Deleted: sha256:25362bc602b9b8ecc6c654d0092f431b666cc797b12cff134cd34056a24b8db5
Deleted: sha256:856958f37eff4b4e417638e0e257781a9a9fb4c17b8eec9a6cf31bf01876670b
Deleted: sha256:b0f4d9170e0d9ee2bbbcc214ececc3ecb0a1df999b9abc90551b95f7caac1647
Deleted: sha256:491a6ce71c358747341e3b5176bde3d55b535a0e99e62e3c3a27170e247fa038
Deleted: sha256:dfd1b7d2feeb2eba5147e8651d39bad8333a62120016a9e0aaba255f4c613b5e
Deleted: sha256:681a40589d7c810e61acaec475faf9f4cbed23ca85a11a088a1abba98b3a3a73
Deleted: sha256:952ed8615615addfbc11a18fefff6be16fca4b585e08cf8bce86d42b02238792
Deleted: sha256:a4fc634f88c4e90d0a4e62f22017f71103002977c59d99bfd9308b53c39f34b6
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220122124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dcf593d4d33dfea412d9e05606500dbea8e7b0ac43ff3325081ceb1b25b66bf1]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220122124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dcf593d4d33dfea412d9e05606500dbea8e7b0ac43ff3325081ceb1b25b66bf1])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dcf593d4d33dfea412d9e05606500dbea8e7b0ac43ff3325081ceb1b25b66bf1
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dcf593d4d33dfea412d9e05606500dbea8e7b0ac43ff3325081ceb1b25b66bf1
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dcf593d4d33dfea412d9e05606500dbea8e7b0ac43ff3325081ceb1b25b66bf1].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 55s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ufviu2mjcksc4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #218

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/218/display/redirect?page=changes>

Changes:

[thiagotnunes] [BEAM-12164]: Add SDF for reading change stream records

[tuyarer] [BEAM-13577] Beam Select's uniquifyNames function loses nullability of

[sergey.kalinin] Update GH Actions to use proper variables names and proper triggers

[dhuntsperger] edited README and comments in Python multi-lang pipes examples

[Pablo Estrada] Revert "Merge pull request #15863 from [BEAM-13184] Autosharding for

[Pablo Estrada] BEAM-13611 reactivating jdbcio xlang test

[Steve Niemitz] [BEAM-13689] Output token elements when BQ batch writes complete.

[noreply] Merge pull request #16371 from [BEAM-13518][Playground] Beam Playground

[noreply] Update Java FnAPI beam master (#16572)

[noreply] [BEAM-13699] Replace fnv with maphash. (#16573)

[noreply] [BEAM-13693] Bump

[noreply] [BEAM-10206] Remove Fatalf calls in non-test goroutines for

[noreply] [BEAM-13430] Re-add provided configuration (#16552)

[noreply] Merge pull request #16540 from [BEAM-13678][Playground]Update Github

[noreply] Merge pull request #16546 from [BEAM-13661] [BEAM-13704] [Playground]

[noreply] Merge pull request #16369 from [BEAM-13558] [Playground] Hide the Graph


------------------------------------------
[...truncated 49.43 KB...]
1b4f9dff9530: Preparing
dcf309103c81: Preparing
5278327e7316: Preparing
3bb5258f46d2: Preparing
832e177bb500: Preparing
f9e18e59a565: Preparing
26a504e63be4: Preparing
8bf42db0de72: Preparing
31892cc314cb: Preparing
11936051f93b: Preparing
8676a0c81ea4: Waiting
1b4f9dff9530: Waiting
dcf309103c81: Waiting
64ce84ead65a: Waiting
1c8cb9413709: Waiting
c0d2ced337ac: Waiting
5278327e7316: Waiting
26a504e63be4: Waiting
f9e18e59a565: Waiting
832e177bb500: Waiting
3bb5258f46d2: Waiting
31892cc314cb: Waiting
8bf42db0de72: Waiting
11936051f93b: Waiting
b7c0ee70947a: Waiting
e53be5bc9b84: Pushed
69d5eb2f8c50: Pushed
b432987d309f: Pushed
f20f31ec6a4c: Pushed
b7c0ee70947a: Pushed
cef3c17d4e4e: Pushed
c0d2ced337ac: Pushed
64ce84ead65a: Pushed
1c8cb9413709: Pushed
8676a0c81ea4: Pushed
dcf309103c81: Pushed
3bb5258f46d2: Layer already exists
832e177bb500: Layer already exists
f9e18e59a565: Layer already exists
26a504e63be4: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
8bf42db0de72: Layer already exists
5278327e7316: Pushed
1b4f9dff9530: Pushed
20220121124336: digest: sha256:3ec3973dc0b0b6d83932bd6a73291cb4b0c401b28b0e248a92abc8aac20a04d4 size: 4520

> Task :sdks:java:testing:load-tests:run
Jan 21, 2022 12:45:20 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 21, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 204 files. Enable logging at DEBUG level to see which files will be staged.
Jan 21, 2022 12:45:21 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 21, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 21, 2022 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 204 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 21, 2022 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 204 files cached, 0 files newly uploaded in 0 seconds
Jan 21, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 21, 2022 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <114075 bytes, hash 4ab60571159b1b7812df15e1bad742c905ef00d0f9251b06e0a5d7047aa8bb0c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-SrYFcRWbG3gS3xXhutdCyQXvAND5JRsG4KXXBHqouww.pb
Jan 21, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 21, 2022 12:45:26 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1947596f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3078cac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f6de586, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f2bd6d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c847072, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43d9f1a2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@23f86d8a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2264ea32, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d3c09ec, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71e4b308, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11900483, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@14a049f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@94e51e8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5de6cf3a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cc36c19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a3a1bf9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2100d047, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4af45442, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d805aa1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@176f7f3b]
Jan 21, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 21, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 21, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 21, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 21, 2022 12:45:26 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34b9eb03]
Jan 21, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 21, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 21, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 21, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 21, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 21, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 21, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 21, 2022 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 21, 2022 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 21, 2022 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 21, 2022 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 21, 2022 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 21, 2022 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-21_04_45_27-1238241219485704708?project=apache-beam-testing
Jan 21, 2022 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-21_04_45_27-1238241219485704708
Jan 21, 2022 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-21_04_45_27-1238241219485704708
Jan 21, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-21T12:45:34.267Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-jjl8. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
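[Editor's note] The warning above shows the job name being rewritten into a valid Cloud Label (underscores and uppercase are not allowed). A hedged sketch of that kind of sanitization, inferred only from the modified name visible in the warning (`LabelSanitizer` is hypothetical, not Dataflow's actual implementation):

```java
// Hypothetical sanitizer: Cloud Labels allow lowercase letters, digits, and
// hyphens, with a 63-character limit. Invalid characters are replaced with
// '0' here, matching the pattern in the modified job name above
// (e.g. underscores became zeros).
class LabelSanitizer {
    static String sanitize(String name) {
        String lower = name.toLowerCase();
        StringBuilder sb = new StringBuilder();
        for (char c : lower.toCharArray()) {
            boolean valid = (c >= 'a' && c <= 'z') || (c >= '0' && c <= '9') || c == '-';
            sb.append(valid ? c : '0');
        }
        String s = sb.toString();
        return s.length() > 63 ? s.substring(0, 63) : s;
    }

    public static void main(String[] args) {
        System.out.println(sanitize("Load_Tests_Java11")); // load0tests0java11
    }
}
```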
Jan 21, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:39.624Z: Worker configuration: e2-standard-2 in us-central1-b.
Jan 21, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:40.805Z: Expanding SplittableParDo operations into optimizable parts.
Jan 21, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:40.831Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 21, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:40.901Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 21, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:40.957Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 21, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:40.987Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 21, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.054Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.154Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.183Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.211Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.234Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.260Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.292Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.313Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.369Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.398Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.459Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.489Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.516Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.551Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.589Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.612Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.637Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.693Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.727Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.763Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.798Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.832Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.868Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:41.902Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
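[Editor's note] The "Fusing consumer X into producer Y" lines above describe the optimizer composing adjacent stages so elements flow in memory instead of through a shuffle. A conceptual sketch of producer/consumer fusion as function composition (illustrative only; these names are not Dataflow internals):

```java
import java.util.function.Function;

// Conceptual model of fusion: two adjacent per-element stages are composed
// into one function, so each element is processed end-to-end without being
// materialized between stages.
class Fusion {
    static Function<Integer, Integer> producer = x -> x * 2; // upstream stage
    static Function<Integer, Integer> fused =
        producer.andThen(x -> x + 1);                        // consumer fused in

    public static void main(String[] args) {
        System.out.println(fused.apply(20)); // 41
    }
}
```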
Jan 21, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:45:42.266Z: Starting 5 workers in us-central1-b...
Jan 21, 2022 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:46:11.251Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 21, 2022 12:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:46:33.098Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 21, 2022 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:47:35.483Z: Workers have started successfully.
Jan 21, 2022 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T12:47:35.516Z: Workers have started successfully.
Jan 21, 2022 4:00:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T16:00:39.016Z: Cancel request is committed for workflow job: 2022-01-21_04_45_27-1238241219485704708.
Jan 21, 2022 4:00:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T16:00:39.121Z: Cleaning up.
Jan 21, 2022 4:00:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T16:00:39.207Z: Stopping worker pool...
Jan 21, 2022 4:00:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T16:00:39.251Z: Stopping worker pool...
Jan 21, 2022 4:03:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T16:03:01.320Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 21, 2022 4:03:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-21T16:03:01.365Z: Worker pool stopped.
Jan 21, 2022 4:03:08 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-21_04_45_27-1238241219485704708 finished with status CANCELLED.
Load test results for test (ID): 031bac3d-6dad-4290-a04c-94f6c1f809bc and timestamp: 2022-01-21T12:45:21.410000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11550.924
dataflow_v2_java11_total_bytes_count             1.92177733E10
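[Editor's note] The two metrics above imply an average throughput; a quick derivation (values copied from the table, unit conversion assumed to be decimal MB):

```java
// Derive average throughput from the reported load-test metrics:
// total bytes processed divided by runtime in seconds.
class Throughput {
    static double mbPerSec(double totalBytes, double runtimeSec) {
        return totalBytes / runtimeSec / 1e6; // bytes/s -> MB/s (decimal)
    }

    public static void main(String[] args) {
        double runtimeSec = 11550.924;     // dataflow_v2_java11_runtime_sec
        double totalBytes = 1.92177733E10; // dataflow_v2_java11_total_bytes_count
        System.out.printf("%.1f MB/s%n", mbPerSec(totalBytes, runtimeSec)); // 1.7 MB/s
    }
}
```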
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
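[Editor's note] The failure above is the load test treating the externally cancelled job as an error. A minimal sketch of that check (an assumed simplification of JobFailure.handleFailure, not the actual Beam source): any terminal state other than DONE is surfaced as a RuntimeException.

```java
// Assumed simplification: a load test run finishing in any terminal state
// other than DONE (e.g. CANCELLED after the timeout-driven cancel above)
// is reported as "Invalid job state: <STATE>.".
class JobStateCheck {
    enum JobState { RUNNING, DONE, CANCELLED, FAILED }

    static void handleFailure(JobState state) {
        if (state != JobState.DONE) {
            throw new RuntimeException("Invalid job state: " + state + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(JobState.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // Invalid job state: CANCELLED.
        }
    }
}
```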

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220121124336
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3ec3973dc0b0b6d83932bd6a73291cb4b0c401b28b0e248a92abc8aac20a04d4
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220121124336]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3ec3973dc0b0b6d83932bd6a73291cb4b0c401b28b0e248a92abc8aac20a04d4]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220121124336] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3ec3973dc0b0b6d83932bd6a73291cb4b0c401b28b0e248a92abc8aac20a04d4])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3ec3973dc0b0b6d83932bd6a73291cb4b0c401b28b0e248a92abc8aac20a04d4
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3ec3973dc0b0b6d83932bd6a73291cb4b0c401b28b0e248a92abc8aac20a04d4
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3ec3973dc0b0b6d83932bd6a73291cb4b0c401b28b0e248a92abc8aac20a04d4].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 49s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ttzkd24leiouc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #217

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/217/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13611] Skip test_xlang_jdbc_write (#16554)

[noreply] Merge pull request #16370 from [BEAM-13556] playground - color and

[noreply] Merge pull request #16531 from [BEAM-13567] [playground] Handle run code

[noreply] Merge pull request #16533 from [BEAM-13548] [Playground] Add example

[noreply] Merge pull request #16519 from [BEAM-13639] [Playground] Add

[noreply] Merge pull request #16518 from [BEAM-13619] [Playground] Add loading

[noreply] Merge pull request #16243 from

[noreply] [BEAM-13683] Make cross-language SQL example pipeline (#16567)

[noreply] [BEAM-13688] fixed type in BPG 4.5.3 window section (#16560)

[noreply] Remove obsolete commands from Inventory job. (#16564)

[noreply] Disable logging for memoization test. (#16556)

[noreply] Merge pull request #16472: [BEAM-13697] Add SchemaFieldNumber annotation

[noreply] Merge pull request #16373 from [BEAM-13515] [Playground] Hiding lines in


------------------------------------------
[...truncated 49.46 KB...]
b028f05ac4fb: Preparing
82e75fe2ed36: Preparing
c6a65b8e1d39: Preparing
3bb5258f46d2: Preparing
832e177bb500: Preparing
f9e18e59a565: Preparing
26a504e63be4: Preparing
8bf42db0de72: Preparing
31892cc314cb: Preparing
11936051f93b: Preparing
4c0cfed6299f: Waiting
fd8e4191e68e: Waiting
b028f05ac4fb: Waiting
6e1547a026db: Waiting
82e75fe2ed36: Waiting
c6a65b8e1d39: Waiting
c4a08df5a277: Waiting
3bb5258f46d2: Waiting
31892cc314cb: Waiting
26a504e63be4: Waiting
832e177bb500: Waiting
11936051f93b: Waiting
8bf42db0de72: Waiting
f9e18e59a565: Waiting
972fb55ee942: Waiting
dbd38091623a: Pushed
5bfcc17525be: Pushed
dfb08282e4da: Pushed
2e609110a9ca: Pushed
c97dc4b0f9a1: Pushed
972fb55ee942: Pushed
6e1547a026db: Pushed
c4a08df5a277: Pushed
fd8e4191e68e: Pushed
4c0cfed6299f: Pushed
3bb5258f46d2: Layer already exists
832e177bb500: Layer already exists
f9e18e59a565: Layer already exists
26a504e63be4: Layer already exists
8bf42db0de72: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
82e75fe2ed36: Pushed
c6a65b8e1d39: Pushed
b028f05ac4fb: Pushed
20220120124334: digest: sha256:1abde018df018c1b8a516c7fe21f822dba94e6c9ccd33a94e150c59c5f986ead size: 4520

> Task :sdks:java:testing:load-tests:run
Jan 20, 2022 12:45:19 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 20, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Jan 20, 2022 12:45:20 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 20, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 20, 2022 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 20, 2022 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Jan 20, 2022 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 20, 2022 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113730 bytes, hash 68fd4bf719ae42a1cc00045dca80329da6f5e765d6a791444c2bb3f08f3e9bd9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-aP1L9xmuQqHMAARdyoAynab152XWp5FETCuz8I8-m9k.pb
Jan 20, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 20, 2022 12:45:25 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e446d92, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57f9b467, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d5c2745, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44b29496, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1947596f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3078cac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f6de586, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f2bd6d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c847072, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43d9f1a2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@23f86d8a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2264ea32, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d3c09ec, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71e4b308, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11900483, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@14a049f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@94e51e8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5de6cf3a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cc36c19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a3a1bf9]
Jan 20, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 20, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 20, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 20, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 20, 2022 12:45:25 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fc6deb7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2]
Jan 20, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 20, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 20, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 20, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 20, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 20, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 20, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 20, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 20, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 20, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 20, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 20, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 20, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-20_04_45_25-3139908799002107795?project=apache-beam-testing
Jan 20, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-20_04_45_25-3139908799002107795
Jan 20, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-20_04_45_25-3139908799002107795
Jan 20, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-20T12:45:37.783Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-39nt. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 20, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:42.615Z: Worker configuration: e2-standard-2 in us-central1-b.
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:43.301Z: Expanding SplittableParDo operations into optimizable parts.
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:43.349Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:43.420Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:43.562Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:43.658Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:43.740Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:43.876Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:43.908Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:43.937Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:43.981Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.011Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.037Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.074Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.107Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.135Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.169Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.227Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.268Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.301Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.350Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.386Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.411Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.437Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.463Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.505Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.542Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.572Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.620Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 20, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:44.648Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 20, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:45.044Z: Starting 5 workers in us-central1-b...
Jan 20, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:45:56.135Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 20, 2022 12:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:46:37.056Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 20, 2022 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:47:39.549Z: Workers have started successfully.
Jan 20, 2022 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T12:47:39.584Z: Workers have started successfully.
Jan 20, 2022 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T16:00:27.344Z: Cancel request is committed for workflow job: 2022-01-20_04_45_25-3139908799002107795.
Jan 20, 2022 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T16:00:27.583Z: Cleaning up.
Jan 20, 2022 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T16:00:27.688Z: Stopping worker pool...
Jan 20, 2022 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T16:00:27.781Z: Stopping worker pool...
Jan 20, 2022 4:02:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T16:02:49.441Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 20, 2022 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-20T16:02:49.487Z: Worker pool stopped.
Jan 20, 2022 4:02:56 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-20_04_45_25-3139908799002107795 finished with status CANCELLED.
Load test results for test (ID): fee768d0-065a-4997-8dc2-d65c601a0597 and timestamp: 2022-01-20T12:45:20.377000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11416.143
dataflow_v2_java11_total_bytes_count              2.0160757E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
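The stack trace above shows the load-test harness (org.apache.beam.sdk.loadtests.JobFailure) treating the job's terminal CANCELLED state as a test failure, even though the pipeline itself ran to its scheduled cancellation. A minimal sketch of that behavior, where `JobState` and `handleFailure` are simplified, hypothetical stand-ins for the real Beam classes:

```java
// Sketch of why a CANCELLED terminal state fails the load test.
// JobState and handleFailure are simplified stand-ins for the
// real org.apache.beam.sdk.loadtests.JobFailure logic, which
// expects a successfully completed (DONE) job.
public class JobFailureSketch {
  enum JobState { DONE, FAILED, CANCELLED, UPDATED }

  // Any terminal state other than DONE is reported as an
  // invalid load-test run, matching the log message above.
  static void handleFailure(JobState state) {
    if (state != JobState.DONE) {
      throw new RuntimeException("Invalid job state: " + state + ".");
    }
  }

  public static void main(String[] args) {
    handleFailure(JobState.DONE); // a DONE job passes silently
    try {
      handleFailure(JobState.CANCELLED);
    } catch (RuntimeException e) {
      // prints "Invalid job state: CANCELLED."
      System.out.println(e.getMessage());
    }
  }
}
```

This is why the 3h15m timed cancellation of the streaming job surfaces as a RuntimeException and a non-zero exit code, which Gradle then reports as the task failure below.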

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220120124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1abde018df018c1b8a516c7fe21f822dba94e6c9ccd33a94e150c59c5f986ead
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220120124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1abde018df018c1b8a516c7fe21f822dba94e6c9ccd33a94e150c59c5f986ead]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220120124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1abde018df018c1b8a516c7fe21f822dba94e6c9ccd33a94e150c59c5f986ead])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1abde018df018c1b8a516c7fe21f822dba94e6c9ccd33a94e150c59c5f986ead
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1abde018df018c1b8a516c7fe21f822dba94e6c9ccd33a94e150c59c5f986ead
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1abde018df018c1b8a516c7fe21f822dba94e6c9ccd33a94e150c59c5f986ead].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 38s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/4nok7kysabrsa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #216

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/216/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Bump beam container version.

[alexander.zhuravlev] [BEAM-13680] Fixed code_repository (added pipelineUuid to RunCodeResult

[Robert Bradshaw] Also bump FnAPI container.

[noreply] [BEAM-13616][BEAM-13645] Switch to vendored grpc 1.43.2 (#16543)

[noreply] [BEAM-13616][BEAM-13646] Upgrade vendored calcite to 1.28.0:0.2 (#16544)

[noreply] Merge pull request #16486 from [BEAM-13544][Playground] Add logs to

[noreply] [BEAM-13683] Correct SQL transform schema, fix expansion address

[noreply] Update walkthrough.md (#16512)

[noreply] [BEAM-11808][BEAM-9879] Support aggregate functions with two arguments

[noreply] Merge pull request #16506 from [BEAM-13652][Playground] Send examples'

[noreply] Merge pull request #16322 from [BEAM-13407] [Playground] Preload fonts

[noreply] [BEAM-13665] Make SpannerIO projectID optional again (#16547)

[noreply] [BEAM-13015] Add state caching capability to be used as hint for runners

[noreply] Merge pull request #16309: [BEAM-13503] Set a default value to

[noreply] [BEAM-13015] Provide caching statistics in the status client. (#16495)


------------------------------------------
[...truncated 46.42 KB...]
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:expansion-service:classes
> Task :sdks:java:expansion-service:jar

> Task :sdks:java:io:kafka:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:kafka:classes
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:container:java11:copyDockerfileDependencies
> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:io:google-cloud-platform:compileJava
> Task :sdks:java:container:java11:docker

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.

See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker


> Task :sdks:java:io:google-cloud-platform:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:google-cloud-platform:classes
> Task :sdks:java:io:google-cloud-platform:jar

> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar
> Task :runners:google-cloud-dataflow-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar

> Task :sdks:java:testing:load-tests:run
Jan 19, 2022 12:53:35 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 19, 2022 12:53:35 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Jan 19, 2022 12:53:36 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 19, 2022 12:53:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 19, 2022 12:53:38 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 19, 2022 12:53:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Jan 19, 2022 12:53:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 19, 2022 12:53:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113730 bytes, hash b13ce9a435c8fb499b4e71174a40c9be6948c42fd7194752387cc8b3b0f09c85> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-sTzppDXI-0mbTnEXSkDJvmlIxC_XGUdSOHzIs7DwnIU.pb
Jan 19, 2022 12:53:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 19, 2022 12:53:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d5c2745, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44b29496, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1947596f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3078cac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f6de586, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f2bd6d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c847072, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43d9f1a2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@23f86d8a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2264ea32, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d3c09ec, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71e4b308, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11900483, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@14a049f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@94e51e8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5de6cf3a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cc36c19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a3a1bf9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2100d047, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4af45442]
Jan 19, 2022 12:53:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 19, 2022 12:53:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 19, 2022 12:53:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 19, 2022 12:53:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 19, 2022 12:53:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a]
Jan 19, 2022 12:53:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 19, 2022 12:53:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 19, 2022 12:53:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 19, 2022 12:53:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 19, 2022 12:53:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 19, 2022 12:53:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 19, 2022 12:53:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 19, 2022 12:53:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 19, 2022 12:53:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 19, 2022 12:53:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 19, 2022 12:53:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 19, 2022 12:53:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 19, 2022 12:53:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-19_04_53_41-169840831742093810?project=apache-beam-testing
Jan 19, 2022 12:53:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-19_04_53_41-169840831742093810
Jan 19, 2022 12:53:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-19_04_53_41-169840831742093810
Jan 19, 2022 12:53:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-19T12:53:48.696Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-rpwo. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:53.161Z: Worker configuration: e2-standard-2 in us-central1-b.
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:53.913Z: Expanding SplittableParDo operations into optimizable parts.
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:53.945Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.032Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.119Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.152Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.223Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.335Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.361Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.409Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.448Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.470Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.495Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.528Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.563Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.609Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.651Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.685Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.710Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.735Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.782Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.812Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.867Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.904Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.930Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:54.965Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:55.000Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:55.034Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:55.068Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 19, 2022 12:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:55.102Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 19, 2022 12:53:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:53:55.504Z: Starting 5 workers in us-central1-b...
Jan 19, 2022 12:54:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:54:12.846Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 19, 2022 12:54:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:54:44.235Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 19, 2022 12:55:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:55:44.242Z: Workers have started successfully.
Jan 19, 2022 12:55:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T12:55:44.264Z: Workers have started successfully.
Jan 19, 2022 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T16:00:30.187Z: Cancel request is committed for workflow job: 2022-01-19_04_53_41-169840831742093810.
Jan 19, 2022 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T16:00:30.523Z: Cleaning up.
Jan 19, 2022 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T16:00:30.613Z: Stopping worker pool...
Jan 19, 2022 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T16:00:30.671Z: Stopping worker pool...
Jan 19, 2022 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T16:02:48.828Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 19, 2022 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-19T16:02:48.868Z: Worker pool stopped.
Jan 19, 2022 4:02:54 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-19_04_53_41-169840831742093810 finished with status CANCELLED.
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
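The stack trace above shows the load-test harness treating the CANCELLED terminal state as a failure: the streaming job is deliberately cancelled after its run window, but any terminal state other than DONE makes `JobFailure.handleFailure` throw. A minimal sketch of that check, with hypothetical names and states (not Beam's actual `JobFailure` implementation):

```java
import java.util.Set;

// Hedged sketch: a harness that fails the build when a pipeline job
// ends in any terminal state other than DONE, as seen in the log above.
public class JobStateCheck {
    enum State { DONE, FAILED, CANCELLED }

    // Terminal states the harness refuses to accept (assumed set).
    static final Set<State> FAILURE_STATES = Set.of(State.FAILED, State.CANCELLED);

    static void handleFailure(State terminalState) {
        if (FAILURE_STATES.contains(terminalState)) {
            // Mirrors the message format observed in the log.
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // prints "Invalid job state: CANCELLED."
        }
    }
}
```

Under this reading, the Gradle task fails even though the job itself ran for the full test duration before being cancelled.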

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220119124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c81798c428844021c982c267f58fd91bbb039db9e0a4a46441ff3542772c72aa]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220119124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c81798c428844021c982c267f58fd91bbb039db9e0a4a46441ff3542772c72aa])].
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c81798c428844021c982c267f58fd91bbb039db9e0a4a46441ff3542772c72aa
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c81798c428844021c982c267f58fd91bbb039db9e0a4a46441ff3542772c72aa].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 41s
109 actionable tasks: 91 executed, 14 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/3oxkindqranj6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #215

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/215/display/redirect?page=changes>

Changes:

[mmack] [BEAM-8806] Integration test for SqsIO

[mmack] [BEAM-13631] Add deterministic SQS message coder to fix reading from SQS

[aydar.zaynutdinov] [BEAM-13641][Playground]

[noreply] Remove jcenter repositories from gradle configuration. (#16532)

[noreply] [BEAM-13430] Remove jcenter which will no longer contain any updates.

[noreply] [BEAM-13616] Update com.google.cloud:libraries-bom to 24.2.0 (#16509)


------------------------------------------
[...truncated 49.52 KB...]
832e177bb500: Preparing
f9e18e59a565: Preparing
26a504e63be4: Preparing
8bf42db0de72: Preparing
31892cc314cb: Preparing
11936051f93b: Preparing
722271f3a152: Waiting
f9e18e59a565: Waiting
832e177bb500: Waiting
26a504e63be4: Waiting
50c07935745a: Waiting
8bf42db0de72: Waiting
9212210b6007: Waiting
31892cc314cb: Waiting
11936051f93b: Waiting
23e63f1124c0: Waiting
3bb5258f46d2: Waiting
0e7875135127: Waiting
9c32afa495b6: Waiting
58641dfa8ea3: Waiting
677631deac9b: Waiting
b399fdbb2aa4: Pushed
e88494a0693b: Pushed
ad9410a238b7: Pushed
58641dfa8ea3: Pushed
b5a971611ba8: Pushed
722271f3a152: Pushed
310f66ea4350: Pushed
9212210b6007: Pushed
677631deac9b: Pushed
3bb5258f46d2: Layer already exists
832e177bb500: Layer already exists
50c07935745a: Pushed
23e63f1124c0: Pushed
f9e18e59a565: Layer already exists
26a504e63be4: Layer already exists
8bf42db0de72: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
9c32afa495b6: Pushed
0e7875135127: Pushed
20220118124848: digest: sha256:dac836ca72005995cd72a445950041cc9c35ee05d8e3de1c3a86c4c78874b9ba size: 4520

> Task :sdks:java:testing:load-tests:run
Jan 18, 2022 12:50:44 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 18, 2022 12:50:44 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Jan 18, 2022 12:50:45 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 18, 2022 12:50:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 18, 2022 12:50:47 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 18, 2022 12:50:48 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 1 seconds
Jan 18, 2022 12:50:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 18, 2022 12:50:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113698 bytes, hash 0cb206bb522ecd7715256b5a9ca6c678b9b783a15d28417f502064636f4f73a2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-DLIGu1IuzXcVJWtanKbGeLm3g6FdKEF_UCBkY29Pc6I.pb
Jan 18, 2022 12:50:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 18, 2022 12:50:51 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71c17a57, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@640ab13c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e0a864d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@440e3ce6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e67f5f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fd53053, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4527f70a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@707b1a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7132a9dc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57435801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da66a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@527fc8e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@61bfc9bf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c7106d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@329bad59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@862f408, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@178f268a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33ecbd6c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c723f2d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@432f521f]
Jan 18, 2022 12:50:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 18, 2022 12:50:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 18, 2022 12:50:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 18, 2022 12:50:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 18, 2022 12:50:51 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ac3f6f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1abebef3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@18f55704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67cefd84, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5fbe155, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6add8e3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58a2b917, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48904d5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12bbfc54, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1491344a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59b65dce, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1386313f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e922647, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@433c6abb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@288f173f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a22c4d8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cd7bc5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6504a875, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35e26d05, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29fa6b65]
Jan 18, 2022 12:50:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 18, 2022 12:50:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 18, 2022 12:50:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 18, 2022 12:50:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 18, 2022 12:50:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 18, 2022 12:50:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 18, 2022 12:50:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 18, 2022 12:50:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 18, 2022 12:50:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 18, 2022 12:50:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 18, 2022 12:50:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 18, 2022 12:50:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 18, 2022 12:50:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-18_04_50_51-14740612117850174335?project=apache-beam-testing
Jan 18, 2022 12:50:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-18_04_50_51-14740612117850174335
Jan 18, 2022 12:50:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-18_04_50_51-14740612117850174335
Jan 18, 2022 12:50:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-18T12:50:59.151Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-8hbk. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:04.294Z: Worker configuration: e2-standard-2 in us-central1-b.
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.015Z: Expanding SplittableParDo operations into optimizable parts.
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.052Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.165Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.237Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.271Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.319Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.412Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.439Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.461Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.493Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.517Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.549Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.580Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.605Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.640Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.675Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.709Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.733Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.764Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 18, 2022 12:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.799Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 18, 2022 12:51:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.832Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 18, 2022 12:51:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.865Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 18, 2022 12:51:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.913Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 18, 2022 12:51:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.947Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 18, 2022 12:51:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:05.982Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 18, 2022 12:51:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:06.016Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 18, 2022 12:51:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:06.048Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 18, 2022 12:51:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:06.083Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 18, 2022 12:51:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:06.116Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 18, 2022 12:51:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:07.054Z: Starting 5 workers in us-central1-b...
Jan 18, 2022 12:51:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:10.965Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 18, 2022 12:51:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:58.721Z: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 18, 2022 12:51:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:51:58.755Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
Jan 18, 2022 12:52:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:52:09.147Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 18, 2022 12:52:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:52:55.717Z: Workers have started successfully.
Jan 18, 2022 12:52:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T12:52:55.747Z: Workers have started successfully.
Jan 18, 2022 4:00:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T16:00:41.201Z: Cancel request is committed for workflow job: 2022-01-18_04_50_51-14740612117850174335.
Jan 18, 2022 4:00:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T16:00:41.269Z: Cleaning up.
Jan 18, 2022 4:00:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T16:00:41.347Z: Stopping worker pool...
Jan 18, 2022 4:00:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T16:00:41.401Z: Stopping worker pool...
Jan 18, 2022 4:03:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T16:03:09.667Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 18, 2022 4:03:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-18T16:03:09.739Z: Worker pool stopped.
Jan 18, 2022 4:03:17 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-18_04_50_51-14740612117850174335 finished with status CANCELLED.
Load test results for test (ID): b632a23d-4ce0-4257-b4f0-aac41b00737d and timestamp: 2022-01-18T12:50:45.103000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11157.359
dataflow_v2_java11_total_bytes_count             2.09129968E10
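The two reported metrics imply a sustained throughput figure. A quick back-of-the-envelope calculation (values copied from the log; the derived rate is my own arithmetic, not a metric the harness reports):

```java
// Derive approximate throughput from the load-test metrics above.
public class Throughput {
    public static void main(String[] args) {
        double runtimeSec = 11157.359;       // dataflow_v2_java11_runtime_sec
        double totalBytes = 2.09129968E10;   // dataflow_v2_java11_total_bytes_count
        double mbPerSec = totalBytes / runtimeSec / 1e6;  // bytes/s -> MB/s
        System.out.printf("throughput ~ %.2f MB/s%n", mbPerSec); // prints "throughput ~ 1.87 MB/s"
    }
}
```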
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220118124848
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dac836ca72005995cd72a445950041cc9c35ee05d8e3de1c3a86c4c78874b9ba
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220118124848]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dac836ca72005995cd72a445950041cc9c35ee05d8e3de1c3a86c4c78874b9ba]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220118124848] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dac836ca72005995cd72a445950041cc9c35ee05d8e3de1c3a86c4c78874b9ba])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dac836ca72005995cd72a445950041cc9c35ee05d8e3de1c3a86c4c78874b9ba
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dac836ca72005995cd72a445950041cc9c35ee05d8e3de1c3a86c4c78874b9ba
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dac836ca72005995cd72a445950041cc9c35ee05d8e3de1c3a86c4c78874b9ba].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 14m 46s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/tfdgdoymiafw4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #214

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/214/display/redirect?page=changes>

Changes:

[mmack] [BEAM-8806] Integration test for SqsIO using Localstack

[noreply] Merge pull request #16507: [BEAM-13137] Fixes ES utest size flakiness


------------------------------------------
[...truncated 49.41 KB...]
f1ef6929757d: Preparing
b77b9f639dab: Preparing
86e18190cb28: Preparing
957d46b6abc4: Preparing
3bb5258f46d2: Preparing
832e177bb500: Preparing
f9e18e59a565: Preparing
26a504e63be4: Preparing
8bf42db0de72: Preparing
31892cc314cb: Preparing
11936051f93b: Preparing
86e18190cb28: Waiting
b7487470add1: Waiting
f1ef6929757d: Waiting
957d46b6abc4: Waiting
7fbc39dcc9b0: Waiting
c6a90acafd41: Waiting
b77b9f639dab: Waiting
3bb5258f46d2: Waiting
832e177bb500: Waiting
f9e18e59a565: Waiting
8bf42db0de72: Waiting
f8843777e89a: Waiting
31892cc314cb: Waiting
11936051f93b: Waiting
eab39fe6ed46: Pushed
1f3138295f14: Pushed
1786af9fad54: Pushed
f8843777e89a: Pushed
4d370c837981: Pushed
522378014087: Pushed
7fbc39dcc9b0: Pushed
f1ef6929757d: Pushed
c6a90acafd41: Pushed
86e18190cb28: Pushed
3bb5258f46d2: Layer already exists
b7487470add1: Pushed
832e177bb500: Layer already exists
f9e18e59a565: Layer already exists
26a504e63be4: Layer already exists
8bf42db0de72: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
957d46b6abc4: Pushed
b77b9f639dab: Pushed
20220117124334: digest: sha256:5f6a819271593d7720664ac95381482ee91945556134d4ca521f74c6a85b7afe size: 4520

> Task :sdks:java:testing:load-tests:run
Jan 17, 2022 12:45:17 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 17, 2022 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Jan 17, 2022 12:45:18 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 17, 2022 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 17, 2022 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 17, 2022 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 1 seconds
Jan 17, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 17, 2022 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash f157ebddf4a9bcb2f28a37f5ea034d72e600d06ada3d99785eb6d8dab444fc4d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-8Vfr3fSpvLLyijf16gNNcuYA0GraPZl4XrbY2rRE_E0.pb
Jan 17, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 17, 2022 12:45:24 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71c17a57, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@640ab13c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e0a864d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@440e3ce6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e67f5f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fd53053, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4527f70a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@707b1a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7132a9dc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57435801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da66a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@527fc8e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@61bfc9bf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c7106d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@329bad59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@862f408, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@178f268a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33ecbd6c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c723f2d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@432f521f]
Jan 17, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 17, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 17, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 17, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 17, 2022 12:45:24 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ac3f6f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1abebef3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@18f55704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67cefd84, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5fbe155, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6add8e3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58a2b917, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48904d5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12bbfc54, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1491344a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59b65dce, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1386313f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e922647, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@433c6abb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@288f173f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a22c4d8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cd7bc5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6504a875, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35e26d05, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29fa6b65]
Jan 17, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 17, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 17, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 17, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 17, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 17, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 17, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 17, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 17, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 17, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 17, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 17, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 17, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-17_04_45_24-468121990315778216?project=apache-beam-testing
Jan 17, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-17_04_45_24-468121990315778216
Jan 17, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-17_04_45_24-468121990315778216
Jan 17, 2022 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-17T12:45:31.917Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-rn5p. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
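[Editor's note: the modified job name in the warning above is the original job name with every character that is illegal in a Cloud Label value replaced. A hypothetical sketch of that sanitization (not Dataflow's actual code; the rules assumed here — lowercase letters, digits, hyphens, at most 63 characters — are from the linked restrictions page):

```java
// Hypothetical sketch of Cloud Label sanitization, matching the modified
// job name shown in the warning above. Not Dataflow's actual implementation.
public class LabelSanitizer {
    static String toLabel(String jobName) {
        // Lowercase, then replace every character outside [a-z0-9-] with '0'.
        String s = jobName.toLowerCase().replaceAll("[^a-z0-9-]", "0");
        // Cloud Label values are limited to 63 characters.
        return s.length() > 63 ? s.substring(0, 63) : s;
    }

    public static void main(String[] args) {
        // Underscores in the Jenkins job name become '0', as in the log above.
        System.out.println(toLabel("load_tests_Java11_Dataflow_V2_streaming_CoGBK_1"));
        // → load0tests0java110dataflow0v20streaming0cogbk01
    }
}
```
]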
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:36.069Z: Worker configuration: e2-standard-2 in us-central1-b.
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:36.894Z: Expanding SplittableParDo operations into optimizable parts.
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:36.930Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.006Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.064Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.090Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.155Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.268Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.300Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.335Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.371Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.409Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.441Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.475Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.508Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.529Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.589Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.627Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.661Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.694Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.716Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.749Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.785Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.818Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.843Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.882Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.906Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.936Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.960Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 17, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:37.985Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 17, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:38.345Z: Starting 5 workers in us-central1-b...
Jan 17, 2022 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:45:51.065Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 17, 2022 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:46:23.338Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 17, 2022 12:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:47:24.814Z: Workers have started successfully.
Jan 17, 2022 12:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T12:47:24.844Z: Workers have started successfully.
Jan 17, 2022 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T16:00:33.108Z: Cancel request is committed for workflow job: 2022-01-17_04_45_24-468121990315778216.
Jan 17, 2022 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T16:00:33.197Z: Cleaning up.
Jan 17, 2022 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T16:00:33.275Z: Stopping worker pool...
Jan 17, 2022 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T16:00:33.327Z: Stopping worker pool...
Jan 17, 2022 4:02:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T16:02:55.002Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 17, 2022 4:02:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-17T16:02:55.038Z: Worker pool stopped.
Jan 17, 2022 4:03:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-17_04_45_24-468121990315778216 finished with status CANCELLED.
Load test results for test (ID): 50b2011e-619f-4c3f-95aa-82d9b4353f54 and timestamp: 2022-01-17T12:45:18.494000000Z:
                              Metric:          Value:
       dataflow_v2_java11_runtime_sec       11547.851
 dataflow_v2_java11_total_bytes_count   1.39522555E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
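[Editor's note: the RuntimeException above is the load-test harness treating any terminal state other than DONE as a failure, so a streaming job cancelled by the scheduled timeout still fails the build. A simplified, hypothetical sketch of that check (not the actual JobFailure source):

```java
// Hypothetical, simplified sketch of the failure handling seen in the stack
// trace above: any terminal job state other than DONE fails the load test,
// so a timeout-driven cancellation surfaces as a RuntimeException.
public class JobFailureSketch {
    enum TerminalState { DONE, FAILED, CANCELLED }

    static void handleFailure(TerminalState state) {
        if (state != TerminalState.DONE) {
            throw new RuntimeException("Invalid job state: " + state + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(TerminalState.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // prints "Invalid job state: CANCELLED."
        }
    }
}
```
]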

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220117124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5f6a819271593d7720664ac95381482ee91945556134d4ca521f74c6a85b7afe
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220117124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5f6a819271593d7720664ac95381482ee91945556134d4ca521f74c6a85b7afe]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220117124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5f6a819271593d7720664ac95381482ee91945556134d4ca521f74c6a85b7afe])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5f6a819271593d7720664ac95381482ee91945556134d4ca521f74c6a85b7afe
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5f6a819271593d7720664ac95381482ee91945556134d4ca521f74c6a85b7afe
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5f6a819271593d7720664ac95381482ee91945556134d4ca521f74c6a85b7afe].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 46s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://scans.gradle.com/s/hcdqv6n4fypmi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #213

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/213/display/redirect>

Changes:


------------------------------------------
[...truncated 48.70 KB...]
888359ca9712: Preparing
09abd1700af8: Preparing
a06a45d6c989: Preparing
9b4a3df3f9a5: Preparing
e251f6e49413: Preparing
3bb5258f46d2: Preparing
832e177bb500: Preparing
f9e18e59a565: Preparing
26a504e63be4: Preparing
8bf42db0de72: Preparing
31892cc314cb: Preparing
11936051f93b: Preparing
b71c3ea6d4ad: Waiting
5679b7a8343c: Waiting
832e177bb500: Waiting
625d95bb813d: Waiting
f9e18e59a565: Waiting
e251f6e49413: Waiting
3bb5258f46d2: Waiting
26a504e63be4: Waiting
9b4a3df3f9a5: Waiting
a06a45d6c989: Waiting
09abd1700af8: Waiting
8bf42db0de72: Waiting
11936051f93b: Waiting
f0012716c6c4: Pushed
347f860594a1: Pushed
026b708a42c6: Pushed
26d23edbd68b: Pushed
0587ec3095e2: Pushed
b71c3ea6d4ad: Pushed
625d95bb813d: Pushed
5679b7a8343c: Pushed
888359ca9712: Pushed
09abd1700af8: Pushed
9b4a3df3f9a5: Pushed
3bb5258f46d2: Layer already exists
832e177bb500: Layer already exists
f9e18e59a565: Layer already exists
26a504e63be4: Layer already exists
8bf42db0de72: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
e251f6e49413: Pushed
a06a45d6c989: Pushed
20220116124336: digest: sha256:3ee0eba6aa49a6a90fcee7d8d8be447d4fdbb22aefddc371ed450cfba84d3394 size: 4520

> Task :sdks:java:testing:load-tests:run
Jan 16, 2022 12:45:20 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 16, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Jan 16, 2022 12:45:21 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 16, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 16, 2022 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 16, 2022 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Jan 16, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 16, 2022 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash a8f5a88cc7b46c9247a1c1824df9bbe430eb2f10449bf63e5af36dcb2888903e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-qPWojMe0bJJHocGCTfm75DDrLxBEm_Y-WvNtyyiIkD4.pb
Jan 16, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 16, 2022 12:45:26 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4aa21f9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71c17a57, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@640ab13c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e0a864d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@440e3ce6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e67f5f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fd53053, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4527f70a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@707b1a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7132a9dc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57435801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da66a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@527fc8e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@61bfc9bf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c7106d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@329bad59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@862f408, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@178f268a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33ecbd6c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c723f2d]
Jan 16, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 16, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 16, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 16, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 16, 2022 12:45:26 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24eb65e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ac3f6f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1abebef3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@18f55704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67cefd84, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5fbe155, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6add8e3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58a2b917, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48904d5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12bbfc54, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1491344a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59b65dce, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1386313f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e922647, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@433c6abb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@288f173f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a22c4d8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cd7bc5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6504a875, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35e26d05]
Jan 16, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 16, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 16, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 16, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 16, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 16, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 16, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 16, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 16, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 16, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 16, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 16, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 16, 2022 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-16_04_45_26-1402912682750613138?project=apache-beam-testing
Jan 16, 2022 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-16_04_45_26-1402912682750613138
Jan 16, 2022 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-16_04_45_26-1402912682750613138
Jan 16, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-16T12:45:35.106Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-25i5. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 16, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:38.714Z: Worker configuration: e2-standard-2 in us-central1-b.
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:39.342Z: Expanding SplittableParDo operations into optimizable parts.
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:39.365Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:39.421Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:39.483Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:39.520Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:39.573Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:39.675Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:39.703Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:39.739Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:39.761Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:39.791Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:39.828Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:39.853Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:39.880Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:39.917Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:39.950Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:39.977Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:40.004Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:40.039Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:40.068Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:40.125Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:40.147Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:40.176Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:40.212Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:40.249Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 16, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:40.279Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 16, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:40.309Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 16, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:40.343Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 16, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:40.370Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 16, 2022 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:45:40.715Z: Starting 5 workers in us-central1-b...
Jan 16, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:46:06.849Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 16, 2022 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:46:26.711Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 16, 2022 12:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:47:28.280Z: Workers have started successfully.
Jan 16, 2022 12:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T12:47:28.313Z: Workers have started successfully.
Jan 16, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T16:00:36.094Z: Cancel request is committed for workflow job: 2022-01-16_04_45_26-1402912682750613138.
Jan 16, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T16:00:36.158Z: Cleaning up.
Jan 16, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T16:00:36.236Z: Stopping worker pool...
Jan 16, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T16:00:36.332Z: Stopping worker pool...
Jan 16, 2022 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T16:02:56.841Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 16, 2022 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-16T16:02:56.878Z: Worker pool stopped.
Jan 16, 2022 4:03:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-16_04_45_26-1402912682750613138 finished with status CANCELLED.
Load test results for test (ID): 2bad41c9-b3e9-4dc3-b4e5-e4d914e9b16b and timestamp: 2022-01-16T12:45:21.253000000Z:
Metric:                                Value:
dataflow_v2_java11_runtime_sec         11561.821
dataflow_v2_java11_total_bytes_count   1.7153606E10
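The two metrics above can be combined into a rough throughput figure for the run. A minimal sketch (the input values are copied from the log lines above; the derived rate is illustrative and is not a number the load test itself reports):

```python
# Derive an approximate throughput from the reported load test metrics.
# Values copied verbatim from the metric lines above; computation is illustrative.
runtime_sec = 11561.821     # dataflow_v2_java11_runtime_sec
total_bytes = 1.7153606e10  # dataflow_v2_java11_total_bytes_count

throughput_mb_per_sec = total_bytes / runtime_sec / 1e6
print(f"approx. throughput: {throughput_mb_per_sec:.2f} MB/s")
```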
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

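The RuntimeException above is raised by the load test's terminal-state check after the scheduled timeout cancels the job: a run only counts as passed if the Dataflow job reaches a successful terminal state, so a CANCELLED job is surfaced as a failure. A hypothetical Python sketch of that kind of check (illustrative only, not Beam's actual JobFailure implementation; the state names follow Dataflow terminal job states):

```python
# Illustrative mirror of a terminal-state check like JobFailure.handleFailure:
# any terminal state other than DONE (e.g. CANCELLED after a timeout,
# or FAILED) is reported as a failed load test run.
SUCCESSFUL_TERMINAL_STATE = "DONE"

def check_terminal_state(state: str) -> None:
    """Raise if the pipeline finished in a non-successful terminal state."""
    if state != SUCCESSFUL_TERMINAL_STATE:
        raise RuntimeError(f"Invalid job state: {state}.")

check_terminal_state("DONE")  # a DONE job passes silently
try:
    check_terminal_state("CANCELLED")
except RuntimeError as e:
    print(e)  # prints: Invalid job state: CANCELLED.
```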
> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220116124336
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3ee0eba6aa49a6a90fcee7d8d8be447d4fdbb22aefddc371ed450cfba84d3394
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220116124336]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3ee0eba6aa49a6a90fcee7d8d8be447d4fdbb22aefddc371ed450cfba84d3394]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220116124336] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3ee0eba6aa49a6a90fcee7d8d8be447d4fdbb22aefddc371ed450cfba84d3394])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3ee0eba6aa49a6a90fcee7d8d8be447d4fdbb22aefddc371ed450cfba84d3394
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3ee0eba6aa49a6a90fcee7d8d8be447d4fdbb22aefddc371ed450cfba84d3394
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3ee0eba6aa49a6a90fcee7d8d8be447d4fdbb22aefddc371ed450cfba84d3394].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 45s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://scans.gradle.com/s/ixnm2xcvfr2hy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #212

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/212/display/redirect?page=changes>

Changes:

[daria.malkova] Split builder into several builder for each step of pipeline execution

[Valentyn Tymofieiev] Provide API to check whether a hint is known.

[Valentyn Tymofieiev] [BEAM-12558] Fix doc typo.

[arietis27] [BEAM-13400] JDBC IO does not support UUID and JSONB PostgreSQL types

[jrmccluskey] [BEAM-10206] Resolve go vet errors in protox package

[noreply] Merge pull request #16482 from [BEAM-13429][Playground] Add builder for

[noreply] [BEAM-13590] Fix  abc imports from collections (#15850)

[jrmccluskey] Fix staticcheck errors in transforms directory

[jrmccluskey] Remove unnecessary fmt.Sprintf() in partition.go

[jrmccluskey] Replace bytes.Compare() with bytes.Equal() in test cases

[jrmccluskey] Replace string(buf.Bytes()) with buf.String() in coder_test.go

[jrmccluskey] Remove unnecessary blank identifier assignment in harness.go

[jrmccluskey] fix capitalized error strings in expansionx

[jrmccluskey] Clean up string cast of bytes in vet.go and corresponding tests

[jrmccluskey] Remove unnecessary fmt call in universal.go

[Robert Bradshaw] Remove tab from source.

[noreply] Redirecting cross-language transforms content (#16504)

[noreply] doc tweaks (#16498)

[noreply] [BEAM-12621] - Update Jenkins VMs to modern Ubuntu version (#16457)

[noreply] [BEAM-13664] Fix Primitives hashing benchmark (#16523)


------------------------------------------
[...truncated 50.03 KB...]
a38927abada2: Pushed
9d52d5eebd34: Pushed
a10ad78746e7: Pushed
2e4cadf9d5c9: Pushed
3bb5258f46d2: Layer already exists
832e177bb500: Layer already exists
5a5577dc5815: Pushed
26a504e63be4: Layer already exists
f9e18e59a565: Layer already exists
8bf42db0de72: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
1f004923d3a8: Pushed
9d2773eff5f0: Pushed
30c60089bba5: Pushed
20220115124341: digest: sha256:b4aab8fa68841baf5d9743fe7f91af1d690f6220e07724d9c38ac2c9914c8987 size: 4520

> Task :sdks:java:testing:load-tests:run
Jan 15, 2022 12:45:36 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 15, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Jan 15, 2022 12:45:37 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 15, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 15, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 15, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Jan 15, 2022 12:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 15, 2022 12:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 45a7534e666ed3951601614982644eb7c6862ce5a11db442404000696bcaba8b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-RadTTmZu05UWAWFJgmROt8aGLOWhHbRCQEAAaWvKuos.pb
Jan 15, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 15, 2022 12:45:42 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71c17a57, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@640ab13c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e0a864d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@440e3ce6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e67f5f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fd53053, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4527f70a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@707b1a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7132a9dc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57435801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da66a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@527fc8e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@61bfc9bf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c7106d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@329bad59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@862f408, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@178f268a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33ecbd6c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c723f2d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@432f521f]
Jan 15, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 15, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 15, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 15, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 15, 2022 12:45:42 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ac3f6f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1abebef3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@18f55704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67cefd84, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5fbe155, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6add8e3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58a2b917, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48904d5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12bbfc54, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1491344a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59b65dce, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1386313f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e922647, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@433c6abb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@288f173f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a22c4d8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cd7bc5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6504a875, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35e26d05, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29fa6b65]
Jan 15, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 15, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 15, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 15, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 15, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 15, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 15, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 15, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 15, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 15, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 15, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 15, 2022 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 15, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-15_04_45_42-5972495277282877666?project=apache-beam-testing
Jan 15, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-15_04_45_42-5972495277282877666
Jan 15, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-15_04_45_42-5972495277282877666
Jan 15, 2022 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-15T12:45:50.079Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-rqwp. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 15, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:54.494Z: Worker configuration: e2-standard-2 in us-central1-b.
Jan 15, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.278Z: Expanding SplittableParDo operations into optimizable parts.
Jan 15, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.304Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 15, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.374Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 15, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.438Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 15, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.463Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 15, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.518Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 15, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.609Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 15, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.636Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 15, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.671Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 15, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.695Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 15, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.721Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 15, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.757Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 15, 2022 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.791Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 15, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.823Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 15, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.859Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 15, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.891Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 15, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.923Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 15, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.967Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 15, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:55.998Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 15, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:56.024Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 15, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:56.055Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 15, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:56.091Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 15, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:56.122Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 15, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:56.157Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 15, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:56.191Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 15, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:56.217Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 15, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:56.254Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 15, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:56.287Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 15, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:56.310Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 15, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:45:56.692Z: Starting 5 workers in us-central1-b...
Jan 15, 2022 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:46:02.931Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 15, 2022 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:46:47.362Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 15, 2022 12:47:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:47:52.664Z: Workers have started successfully.
Jan 15, 2022 12:47:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T12:47:52.696Z: Workers have started successfully.
Jan 15, 2022 4:00:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T16:00:38.442Z: Cancel request is committed for workflow job: 2022-01-15_04_45_42-5972495277282877666.
Jan 15, 2022 4:00:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T16:00:38.513Z: Cleaning up.
Jan 15, 2022 4:00:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T16:00:38.586Z: Stopping worker pool...
Jan 15, 2022 4:00:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T16:00:38.627Z: Stopping worker pool...
Jan 15, 2022 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T16:02:56.710Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 15, 2022 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-15T16:02:56.745Z: Worker pool stopped.
Jan 15, 2022 4:03:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-15_04_45_42-5972495277282877666 finished with status CANCELLED.
Load test results for test (ID): ac5c9b78-e09b-414f-aa13-8d8f6f1b5186 and timestamp: 2022-01-15T12:45:37.363000000Z:
Metric:                                Value:
dataflow_v2_java11_runtime_sec         11543.296
dataflow_v2_java11_total_bytes_count   2.3769515E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220115124341
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b4aab8fa68841baf5d9743fe7f91af1d690f6220e07724d9c38ac2c9914c8987
Deleted: sha256:6ec89fcd6dead460de8cd87816975a20e293d629f57e0f8aa1981b39370e2040
Deleted: sha256:9406e0918aa6a65159c6371950d2273b035ad79951eda982ed379436b1689e57
Deleted: sha256:dd9514ceb6ee45a4457d87ec0bf0b5bb189d4492cdbf449cd84cfacc21cda27d
Deleted: sha256:6a99389e62d556f6ac71f56691be344caad6548ffc6cf41ea014b8c91d64d7c8
Deleted: sha256:721d0e0120ae6e4047050032cc5bb22752d74800c579d3e8a673eba9c447805e
Deleted: sha256:802a3df89ae80be2b998d12ff94c22e3a0022411bb44726458fc3318d3120831
Deleted: sha256:b8af2d9d05f5c118aefffa98f3bd5b64a84412973cb055e000a6db5e2109145c
Deleted: sha256:e383524933f3f8e12c64759addeadc875d29760e7cc56aad30b7bc1de72f9e4c
Deleted: sha256:93426dade9c37475acad3f81a1cc8599441f2a1266b62507cfb1922279aae667
Deleted: sha256:b3f4bc6205bebc85299e08d46f5210d164c31b6ff94b7e143e7b018f6b706b90
Deleted: sha256:369815a2cc65062115ceead7dfdb1e190ced068001d990b992763b9c46d8ed07
Deleted: sha256:99185a549500d62eb458a53da1aef8973d06ba789664978d0a84a859d284d4ea
Deleted: sha256:887d72ed2a58e05999be76b910e954b14b1a5bccc7d452052e3ca1fc7073599f
Deleted: sha256:9803b77d27647efb5784ea0f56de4cd501e286014fdb4337d577628cc50ffe4e
Deleted: sha256:9a469452d2c5fa5b0d577dfba41a05e711ed2813da0a806919bb2859f8cab6e5
Deleted: sha256:0030d1934c18ad3a16d46ad62a96d56e62b295048f09c1e724c19acebc0b3e84
Deleted: sha256:d4e4e4ee1c4ae5b73007c248c9008245084d02d12b3f853f0789d45564531349
Deleted: sha256:08a1cf90ff937882dbd5453bd9b143c3065f46e4d5c074a651567af46ddfac85
Deleted: sha256:dfba063e8571cbb3571c0f15b8a07c8a3a0ea8f427b2b5a1452f8fc74533588a
Deleted: sha256:125b72147cf2bc55607b07c7a33484111fd818f340e46f877b9450534816fd93
Deleted: sha256:f8b4ce50e8e4a24af0ea5a1c4c920eb51ed299041d3df04fe0bde194a3681710
Deleted: sha256:0bdf15d948a75b72ad25465d206dab850503dc0e29d38251aafcfcc0417959f2
Deleted: sha256:d0f9b4d91ffbc5840208672b93ed50a68685c760f5641af4c280ca04311ed71e
Deleted: sha256:732ade92fd45a248c02906f2d610281e57b8a11a5995b8d4208a23ba4cd7180e
Deleted: sha256:35176e0f2523094dcbf74f6f254c9886ba62ded7033895e321303a2661073ef8
Deleted: sha256:79ab0703975668a2cb13182563c6c38f6282bd6ab481f9cbae4cc81e26999d58
Deleted: sha256:60d13ff9ddaf0288f8a0863f7221202d9be383cb19bcaf551358eef08626acdf
Deleted: sha256:c771463041a724f302dc5b9d507d2e374a132a1357a08403fda243f49cce7a16
Deleted: sha256:641d763d8fe2505905e663e0800378f074dd57e37ff4516054ec7291e5723199
Deleted: sha256:bdf0afb294ff605ab8e448018187cbe17186a6bcf40e19d23da6c5bbc7b6d063
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220115124341]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b4aab8fa68841baf5d9743fe7f91af1d690f6220e07724d9c38ac2c9914c8987]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220115124341] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b4aab8fa68841baf5d9743fe7f91af1d690f6220e07724d9c38ac2c9914c8987])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b4aab8fa68841baf5d9743fe7f91af1d690f6220e07724d9c38ac2c9914c8987
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b4aab8fa68841baf5d9743fe7f91af1d690f6220e07724d9c38ac2c9914c8987
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b4aab8fa68841baf5d9743fe7f91af1d690f6220e07724d9c38ac2c9914c8987].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 43s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://scans.gradle.com/s/n2hfr3kab5jhy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #211

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/211/display/redirect?page=changes>

Changes:

[blais] python sdk examples: Fixed typo in wordcount example.

[noreply] [BEAM-13480] Increase pipeline timeout for

[noreply] Stronger typing inference for CoGBK. (#16465)

[noreply] [BEAM-12464] Change ProtoSchemaTranslator beam schema creation to match

[noreply] Introduce the notion of a JoinIndex for fewer shuffles. (#16101)

[noreply] Merge pull request #16467 from [BEAM-12164]: SpannerIO

[noreply] Merge pull request #16385 from [BEAM-13535] [Playground] add cancel

[noreply] Merge pull request #16485 from [BEAM-13486] [Playground] For unit tests

[heejong] [BEAM-13455] Remove duplicated artifacts when using multiple

[noreply] [BEAM-12572] Run java examples on multiple runners (#16450)


------------------------------------------
[...truncated 49.39 KB...]
cb43c1fe0996: Preparing
f0a1111d4491: Preparing
d7c4e2d10daf: Preparing
3bb5258f46d2: Preparing
832e177bb500: Preparing
f9e18e59a565: Preparing
26a504e63be4: Preparing
8bf42db0de72: Preparing
31892cc314cb: Preparing
11936051f93b: Preparing
5b11b246262a: Waiting
832e177bb500: Waiting
f9e18e59a565: Waiting
f0a1111d4491: Waiting
397b05db04c8: Waiting
26a504e63be4: Waiting
1244ddaeae0c: Waiting
d7c4e2d10daf: Waiting
9e77d401e742: Waiting
8bf42db0de72: Waiting
cb43c1fe0996: Waiting
3bb5258f46d2: Waiting
3249656ac991: Waiting
31892cc314cb: Waiting
11936051f93b: Waiting
38ab8d346dc1: Pushed
132ed9479fa7: Pushed
463ecb5f1c66: Pushed
7cdd743ca3f8: Pushed
3249656ac991: Pushed
5b11b246262a: Pushed
6562c9c587c2: Pushed
1244ddaeae0c: Pushed
9e77d401e742: Pushed
397b05db04c8: Pushed
f0a1111d4491: Pushed
3bb5258f46d2: Layer already exists
832e177bb500: Layer already exists
f9e18e59a565: Layer already exists
26a504e63be4: Layer already exists
8bf42db0de72: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
d7c4e2d10daf: Pushed
cb43c1fe0996: Pushed
20220114124337: digest: sha256:3311899c6758cae2f19eef6b7f8558b3bd5d4408f1c381879372a16dee6ea8d5 size: 4520

> Task :sdks:java:testing:load-tests:run
Jan 14, 2022 12:45:19 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 14, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Jan 14, 2022 12:45:20 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 14, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 14, 2022 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 14, 2022 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Jan 14, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 14, 2022 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 9e97fc0410de10f7ade4bbba422a763d2d1d1ac1941ae34f9a9938dc9cb88544> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-npf8BBDeEPet5Lu6Qip2PS0dGsGUGuNPmpk43Jy4hUQ.pb
Jan 14, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 14, 2022 12:45:26 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71c17a57, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@640ab13c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e0a864d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@440e3ce6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e67f5f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fd53053, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4527f70a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@707b1a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7132a9dc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57435801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da66a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@527fc8e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@61bfc9bf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c7106d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@329bad59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@862f408, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@178f268a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33ecbd6c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c723f2d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@432f521f]
Jan 14, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 14, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 14, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 14, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 14, 2022 12:45:26 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ac3f6f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1abebef3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@18f55704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67cefd84, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5fbe155, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6add8e3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58a2b917, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48904d5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12bbfc54, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1491344a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59b65dce, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1386313f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e922647, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@433c6abb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@288f173f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a22c4d8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cd7bc5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6504a875, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35e26d05, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29fa6b65]
Jan 14, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 14, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 14, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 14, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 14, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 14, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 14, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 14, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 14, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 14, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 14, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 14, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 14, 2022 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-14_04_45_26-9432682779076989600?project=apache-beam-testing
Jan 14, 2022 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-14_04_45_26-9432682779076989600
Jan 14, 2022 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-14_04_45_26-9432682779076989600
Jan 14, 2022 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-14T12:45:33.703Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-5xzc. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 14, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:37.074Z: Worker configuration: e2-standard-2 in us-central1-b.
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:37.888Z: Expanding SplittableParDo operations into optimizable parts.
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:37.918Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:37.973Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.023Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.055Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.120Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.212Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.239Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.259Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.294Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.324Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.360Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.395Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.434Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.466Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.499Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.524Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.556Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.594Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.624Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.661Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.685Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.721Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.759Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.786Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.815Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.833Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.864Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:38.895Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 14, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:45:39.303Z: Starting 5 workers in us-central1-b...
Jan 14, 2022 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:46:10.454Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 14, 2022 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:46:29.365Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 14, 2022 12:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:47:29.629Z: Workers have started successfully.
Jan 14, 2022 12:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T12:47:29.660Z: Workers have started successfully.
Jan 14, 2022 4:01:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T16:01:31.599Z: Cancel request is committed for workflow job: 2022-01-14_04_45_26-9432682779076989600.
Jan 14, 2022 4:01:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T16:01:31.689Z: Cleaning up.
Jan 14, 2022 4:01:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T16:01:31.773Z: Stopping worker pool...
Jan 14, 2022 4:01:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T16:01:31.838Z: Stopping worker pool...
Jan 14, 2022 4:03:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T16:03:56.825Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 14, 2022 4:03:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-14T16:03:56.862Z: Worker pool stopped.
Jan 14, 2022 4:04:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-14_04_45_26-9432682779076989600 finished with status CANCELLED.
Load test results for test (ID): 70334f63-b685-4db2-b8ee-d7c099a30f54 and timestamp: 2022-01-14T12:45:20.469000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11607.913
dataflow_v2_java11_total_bytes_count             2.42735823E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220114124337
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3311899c6758cae2f19eef6b7f8558b3bd5d4408f1c381879372a16dee6ea8d5
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220114124337]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3311899c6758cae2f19eef6b7f8558b3bd5d4408f1c381879372a16dee6ea8d5]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220114124337] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3311899c6758cae2f19eef6b7f8558b3bd5d4408f1c381879372a16dee6ea8d5])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3311899c6758cae2f19eef6b7f8558b3bd5d4408f1c381879372a16dee6ea8d5
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3311899c6758cae2f19eef6b7f8558b3bd5d4408f1c381879372a16dee6ea8d5
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3311899c6758cae2f19eef6b7f8558b3bd5d4408f1c381879372a16dee6ea8d5].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 44s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://scans.gradle.com/s/m7hbphjovuh3y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #210

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/210/display/redirect?page=changes>

Changes:

[mmack] [BEAM-13638] Datatype of timestamp fields in SqsMessage for AWS IOs for

[noreply] Merge pull request #16469 from [BEAM-13623][Playground] [Bugfix] During

[noreply] Merge pull request #16149 from [BEAM-13113] [Playground] playground

[noreply] Merge pull request #16363 from [BEAM-13557] [Playground] show code

[noreply] Merge pull request #16374 from [BEAM-13398][Playground] Split LifeCycle

[noreply] [BEAM-13616][BEAM-13646] Update vendored calcite 1.28.0 with protobuf

[chamikaramj] Adds several example multi-language Python pipelines

[noreply] Merge pull request #16325 from [BEAM-13471] [Playground] Tag existing

[noreply] [BEAM-13399] Move service liveness polling to Runner type (#16487)


------------------------------------------
[...truncated 49.98 KB...]
92cadc1c340f: Pushed
015c4fab276a: Pushed
dd6d2620e6eb: Pushed
3df9b6f31d16: Pushed
5753c7b8c27f: Pushed
1fff288752e7: Pushed
3bb5258f46d2: Layer already exists
832e177bb500: Layer already exists
f9e18e59a565: Layer already exists
26a504e63be4: Layer already exists
8bf42db0de72: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
27d306cfcb2a: Pushed
afcfd170532b: Pushed
663df8e9d96f: Pushed
7875d03a61dd: Pushed
20220113124340: digest: sha256:b81bb13a3c6a55bdec04affd0d5c721e3fcf863ec4bad69f3c570661454df943 size: 4520

> Task :sdks:java:testing:load-tests:run
Jan 13, 2022 12:45:37 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 13, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Jan 13, 2022 12:45:38 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 13, 2022 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 13, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 13, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Jan 13, 2022 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 13, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash b1d0db656305729a8d6ed15cad89f5603f346a0ec7631f586db2029b214ca77c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-sdDbZWMFcpqNbtFcrYn1YD80ag7HYx9YbbICmyFMp3w.pb
Jan 13, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 13, 2022 12:45:43 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71c17a57, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@640ab13c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e0a864d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@440e3ce6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e67f5f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fd53053, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4527f70a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@707b1a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7132a9dc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57435801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da66a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@527fc8e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@61bfc9bf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c7106d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@329bad59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@862f408, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@178f268a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33ecbd6c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c723f2d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@432f521f]
Jan 13, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 13, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 13, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 13, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 13, 2022 12:45:43 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ac3f6f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1abebef3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@18f55704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67cefd84, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5fbe155, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6add8e3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58a2b917, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48904d5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12bbfc54, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1491344a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59b65dce, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1386313f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e922647, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@433c6abb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@288f173f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a22c4d8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cd7bc5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6504a875, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35e26d05, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29fa6b65]
Jan 13, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 13, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 13, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 13, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 13, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 13, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 13, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 13, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 13, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 13, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 13, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 13, 2022 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 13, 2022 12:45:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-13_04_45_43-18335074291361404234?project=apache-beam-testing
Jan 13, 2022 12:45:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-13_04_45_43-18335074291361404234
Jan 13, 2022 12:45:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-13_04_45_43-18335074291361404234
Jan 13, 2022 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-13T12:45:51.003Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-4j6r. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:54.925Z: Worker configuration: e2-standard-2 in us-central1-b.
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:55.667Z: Expanding SplittableParDo operations into optimizable parts.
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:55.699Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:55.760Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:55.833Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:55.874Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:55.942Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.066Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.110Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.145Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.184Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.219Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.240Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.274Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.299Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.334Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.363Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.405Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.447Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.480Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.510Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.555Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.608Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.656Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.697Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.726Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.768Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.808Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.830Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 13, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:56.860Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 13, 2022 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:45:57.222Z: Starting 5 workers in us-central1-b...
Jan 13, 2022 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:46:29.068Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 13, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:46:41.900Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 13, 2022 12:47:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:47:43.323Z: Workers have started successfully.
Jan 13, 2022 12:47:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T12:47:43.343Z: Workers have started successfully.
Jan 13, 2022 3:12:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-13T15:12:57.420Z: Staged package animal-sniffer-annotations-1.20-vt1E38otwrj1wIzR1vDgznQJTsZ3gSYJaOA_wOd1Iqw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/animal-sniffer-annotations-1.20-vt1E38otwrj1wIzR1vDgznQJTsZ3gSYJaOA_wOd1Iqw.jar' is inaccessible.
Jan 13, 2022 3:12:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-13T15:12:58.297Z: Staged package beam-model-job-management-2.37.0-SNAPSHOT-8QB2g9ssFGaKGT56IbOMrjcQcE9XP-LgiRhvff3RHeg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.37.0-SNAPSHOT-8QB2g9ssFGaKGT56IbOMrjcQcE9XP-LgiRhvff3RHeg.jar' is inaccessible.
Jan 13, 2022 3:12:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-13T15:12:58.371Z: Staged package beam-runners-core-java-2.37.0-SNAPSHOT-sTk1eIXjfmaDIL0P3bF0KPVVvPGkThUdBb57OJlTv5U.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-java-2.37.0-SNAPSHOT-sTk1eIXjfmaDIL0P3bF0KPVVvPGkThUdBb57OJlTv5U.jar' is inaccessible.
Jan 13, 2022 3:12:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-13T15:12:58.495Z: Staged package beam-sdks-java-expansion-service-2.37.0-SNAPSHOT-CMK5mJnjvuuagVTF6zvpZgE5-7pMS8_SldvJq3otYp4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.37.0-SNAPSHOT-CMK5mJnjvuuagVTF6zvpZgE5-7pMS8_SldvJq3otYp4.jar' is inaccessible.
Jan 13, 2022 3:12:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-13T15:12:58.542Z: Staged package beam-sdks-java-extensions-arrow-2.37.0-SNAPSHOT--6o0gUFJinkN4upVAkTzRdayVoYWlnjhjcsg2E3us1A.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-arrow-2.37.0-SNAPSHOT--6o0gUFJinkN4upVAkTzRdayVoYWlnjhjcsg2E3us1A.jar' is inaccessible.
Jan 13, 2022 3:12:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-13T15:12:58.613Z: Staged package beam-sdks-java-extensions-google-cloud-platform-core-2.37.0-SNAPSHOT-SrfdnSv8coZuWbihhjsTZYCINtfTYjEt4AvCESw7-dE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.37.0-SNAPSHOT-SrfdnSv8coZuWbihhjsTZYCINtfTYjEt4AvCESw7-dE.jar' is inaccessible.
Jan 13, 2022 3:12:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-13T15:12:58.660Z: Staged package beam-sdks-java-extensions-protobuf-2.37.0-SNAPSHOT-hxuOLL6xOZT1Brk6taQ-FXgM8GLM-VmNPHe33xYH4j0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.37.0-SNAPSHOT-hxuOLL6xOZT1Brk6taQ-FXgM8GLM-VmNPHe33xYH4j0.jar' is inaccessible.
Jan 13, 2022 3:12:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-13T15:12:58.753Z: Staged package beam-sdks-java-io-kafka-2.37.0-SNAPSHOT-BHRQGDx1kM7DSZHo50ZsxR2Y_2IANhpSpKUlptYGXGI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-kafka-2.37.0-SNAPSHOT-BHRQGDx1kM7DSZHo50ZsxR2Y_2IANhpSpKUlptYGXGI.jar' is inaccessible.
Jan 13, 2022 3:12:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-13T15:12:58.795Z: Staged package beam-sdks-java-io-kinesis-2.37.0-SNAPSHOT-K5NpTSuLXSF3kKXHV9tZ2pX6bAANx1QDeOnHWoH7U1A.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-kinesis-2.37.0-SNAPSHOT-K5NpTSuLXSF3kKXHV9tZ2pX6bAANx1QDeOnHWoH7U1A.jar' is inaccessible.
Jan 13, 2022 3:12:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-13T15:12:58.842Z: Staged package beam-sdks-java-io-synthetic-2.37.0-SNAPSHOT-jarS5mJehhs9eb-NZbUKGKDXV420CBUVy0a4JE4KVs4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-synthetic-2.37.0-SNAPSHOT-jarS5mJehhs9eb-NZbUKGKDXV420CBUVy0a4JE4KVs4.jar' is inaccessible.
Jan 13, 2022 3:12:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-13T15:12:58.893Z: Staged package beam-sdks-java-load-tests-2.37.0-SNAPSHOT-Mb3UZLeMQrMhn1ge-C2r1r257amJgt9KHHOfFBeIOhQ.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-load-tests-2.37.0-SNAPSHOT-Mb3UZLeMQrMhn1ge-C2r1r257amJgt9KHHOfFBeIOhQ.jar' is inaccessible.
Jan 13, 2022 3:12:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-13T15:12:58.938Z: Staged package beam-sdks-java-test-utils-2.37.0-SNAPSHOT-uDXAbi02WU3WDaNGqETzLc5tbQsW_968W1HOImTwz-s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.37.0-SNAPSHOT-uDXAbi02WU3WDaNGqETzLc5tbQsW_968W1HOImTwz-s.jar' is inaccessible.
Jan 13, 2022 3:13:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-13T15:13:01.491Z: Staged package opencensus-contrib-http-util-0.28.0-ScPbKinx_bL3OSjL6pab0dQPq3zFu2JzAiur2W96eJs.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/opencensus-contrib-http-util-0.28.0-ScPbKinx_bL3OSjL6pab0dQPq3zFu2JzAiur2W96eJs.jar' is inaccessible.
Jan 13, 2022 3:13:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-13T15:13:02.061Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
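
The SEVERE entries above name the staged jars the Dataflow service could not read from the staging bucket. As an illustrative sketch (not part of the build tooling), the GCS locations can be pulled out of such log lines for a follow-up permissions check; the sample line below shortens the jar's hash suffix for brevity:

```python
import re

# Extract the gs:// location from an "is inaccessible" SEVERE log entry.
sample = (
    "SEVERE: 2022-01-13T15:13:01.491Z: Staged package "
    "opencensus-contrib-http-util-0.28.0-ScPbKinx.jar at location "
    "'gs://temp-storage-for-perf-tests/loadtests/staging/"
    "opencensus-contrib-http-util-0.28.0-ScPbKinx.jar' is inaccessible."
)
pattern = re.compile(r"at location '(gs://[^']+)' is inaccessible")
match = pattern.search(sample)
inaccessible = match.group(1) if match else None
print(inaccessible)
```

Each extracted path could then be checked by hand (for example with `gsutil stat <path>`) to confirm whether the worker service account lost read access to the staging location.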
Jan 13, 2022 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T16:00:33.694Z: Cancel request is committed for workflow job: 2022-01-13_04_45_43-18335074291361404234.
Jan 13, 2022 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T16:00:33.758Z: Cleaning up.
Jan 13, 2022 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T16:00:33.832Z: Stopping worker pool...
Jan 13, 2022 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T16:00:33.883Z: Stopping worker pool...
Jan 13, 2022 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T16:02:56.273Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 13, 2022 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-13T16:02:56.308Z: Worker pool stopped.
Jan 13, 2022 4:03:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-13_04_45_43-18335074291361404234 finished with status CANCELLED.
Load test results for test (ID): aca1ea46-0b56-402b-8e02-839cf487ae34 and timestamp: 2022-01-13T12:45:37.952000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11530.456
dataflow_v2_java11_total_bytes_count             2.41150937E10
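
For context, the two reported metrics above imply the pipeline's aggregate throughput before cancellation; a minimal sketch of that arithmetic, using the values exactly as logged:

```python
# Aggregate throughput implied by the load-test metrics logged above.
runtime_sec = 11530.456           # dataflow_v2_java11_runtime_sec
total_bytes = 2.41150937e10       # dataflow_v2_java11_total_bytes_count

throughput_mb_s = total_bytes / runtime_sec / 1e6  # megabytes per second
print(round(throughput_mb_s, 2))
```

Roughly 2 MB/s across the whole 3h12m run, which is consistent with the job having been cancelled rather than completing its workload.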
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220113124340
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b81bb13a3c6a55bdec04affd0d5c721e3fcf863ec4bad69f3c570661454df943
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220113124340]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b81bb13a3c6a55bdec04affd0d5c721e3fcf863ec4bad69f3c570661454df943]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220113124340] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b81bb13a3c6a55bdec04affd0d5c721e3fcf863ec4bad69f3c570661454df943])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b81bb13a3c6a55bdec04affd0d5c721e3fcf863ec4bad69f3c570661454df943
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b81bb13a3c6a55bdec04affd0d5c721e3fcf863ec4bad69f3c570661454df943
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b81bb13a3c6a55bdec04affd0d5c721e3fcf863ec4bad69f3c570661454df943].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 42s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ypmx5uhgkzxjg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #209

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/209/display/redirect?page=changes>

Changes:

[ningkang0957] [BEAM-13602] Prevented metrics gathering from failing bigtable io

[ningkang0957] make the code more pythonic

[mmack] [BEAM-13243][BEAM-8374] Add support for missing PublishResponse fields

[Robert Bradshaw] Optional args and kwargs for named external transforms.

[noreply] [BEAM-13628] Update SideInputCache to use full Transform and

[noreply] [BEAM-13432] Skip ExpansionService creation in Job Server (#16222)

[noreply] [BEAM-13616] Initial files for vendored gRPC 1.43.2 (#16460)


------------------------------------------
[...truncated 51.90 KB...]
Jan 12, 2022 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 12, 2022 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 6522d52742e16e054120c6f4efe9c40732f4b02644e43f9cbc0517ce707fc1d5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ZSLVJ0LhbgVBIMb07-nEBzL0sCZE5D-cvAUXznB_wdU.pb
Jan 12, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 12, 2022 12:45:22 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71c17a57, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@640ab13c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e0a864d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@440e3ce6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e67f5f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fd53053, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4527f70a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@707b1a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7132a9dc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57435801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da66a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@527fc8e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@61bfc9bf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c7106d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@329bad59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@862f408, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@178f268a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33ecbd6c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c723f2d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@432f521f]
Jan 12, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 12, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 12, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 12, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 12, 2022 12:45:22 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ac3f6f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1abebef3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@18f55704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67cefd84, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5fbe155, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6add8e3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58a2b917, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48904d5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12bbfc54, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1491344a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59b65dce, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1386313f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e922647, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@433c6abb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@288f173f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a22c4d8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cd7bc5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6504a875, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35e26d05, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29fa6b65]
Jan 12, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 12, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 12, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 12, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 12, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 12, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 12, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 12, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 12, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 12, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 12, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 12, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 12, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-12_04_45_22-2112328189684646921?project=apache-beam-testing
Jan 12, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-12_04_45_22-2112328189684646921
Jan 12, 2022 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-12_04_45_22-2112328189684646921
Jan 12, 2022 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-12T12:45:32.394Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-zia0. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
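The warning above shows Dataflow deriving a label-safe name ("load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-zia0") from the job name. A minimal sketch of one plausible sanitization, assuming the constraint is lowercase letters, digits, and hyphens with a 63-character cap; this is an illustration only, not necessarily the exact rule Dataflow applies:

```java
// Hypothetical sketch: coerce a job name into a label-safe string by
// lowercasing, replacing disallowed characters with '0', and truncating.
public class LabelSanitizer {
    static String sanitize(String name) {
        // Keep only lowercase letters, digits, and hyphens; substitute '0'.
        String cleaned = name.toLowerCase().replaceAll("[^a-z0-9-]", "0");
        // Cloud labels are capped in length (assumed 63 characters here).
        return cleaned.length() <= 63 ? cleaned : cleaned.substring(0, 63);
    }

    public static void main(String[] args) {
        // Underscores and spaces become '0', mirroring the modified name above.
        System.out.println(sanitize("Load_Tests Java11"));
    }
}
```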
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:37.324Z: Worker configuration: e2-standard-2 in us-central1-b.
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:37.975Z: Expanding SplittableParDo operations into optimizable parts.
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.013Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.080Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.177Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.205Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.270Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.362Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.397Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.429Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.463Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.499Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.533Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.575Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.621Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.651Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.675Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.699Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.735Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.761Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.785Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.823Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.846Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.872Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.909Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.941Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:38.979Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:39.033Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:39.057Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:39.092Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 12, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:45:39.442Z: Starting 5 workers in us-central1-b...
Jan 12, 2022 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:46:02.553Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 12, 2022 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:46:22.971Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 12, 2022 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:47:24.007Z: Workers have started successfully.
Jan 12, 2022 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T12:47:24.043Z: Workers have started successfully.
Jan 12, 2022 1:25:54 PM org.apache.beam.sdk.metrics.MetricsEnvironment getCurrentContainer
WARNING: Reporting metrics are not supported in the current execution environment.
Jan 12, 2022 1:51:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-12T13:51:40.929Z: Staged package gcsio-2.2.4-p9i4-xCmDBUow6YptIUTZYGtZZhNat4tocR78u1Q8kQ.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/gcsio-2.2.4-p9i4-xCmDBUow6YptIUTZYGtZZhNat4tocR78u1Q8kQ.jar' is inaccessible.
Jan 12, 2022 1:51:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-12T13:51:41.206Z: Staged package google-api-services-iamcredentials-v1-rev20210326-1.32.1-rzLlTs22sMZOFz3yEOfAB70ZZw2Gl99hr42n_k-f8-M.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-iamcredentials-v1-rev20210326-1.32.1-rzLlTs22sMZOFz3yEOfAB70ZZw2Gl99hr42n_k-f8-M.jar' is inaccessible.
Jan 12, 2022 1:51:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-12T13:51:43.788Z: Staged package util-2.2.4-r-7-okFosTGJSr5au-fNqmC_EgCN_DyG3X-dgI2RZFA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/util-2.2.4-r-7-okFosTGJSr5au-fNqmC_EgCN_DyG3X-dgI2RZFA.jar' is inaccessible.
Jan 12, 2022 1:51:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-12T13:51:43.863Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 12, 2022 1:54:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-12T13:54:44.130Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 12, 2022 1:57:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-12T13:57:41.082Z: Staged package gcsio-2.2.4-p9i4-xCmDBUow6YptIUTZYGtZZhNat4tocR78u1Q8kQ.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/gcsio-2.2.4-p9i4-xCmDBUow6YptIUTZYGtZZhNat4tocR78u1Q8kQ.jar' is inaccessible.
Jan 12, 2022 1:57:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-12T13:57:41.267Z: Staged package google-api-services-iamcredentials-v1-rev20210326-1.32.1-rzLlTs22sMZOFz3yEOfAB70ZZw2Gl99hr42n_k-f8-M.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-iamcredentials-v1-rev20210326-1.32.1-rzLlTs22sMZOFz3yEOfAB70ZZw2Gl99hr42n_k-f8-M.jar' is inaccessible.
Jan 12, 2022 1:57:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-12T13:57:43.441Z: Staged package util-2.2.4-r-7-okFosTGJSr5au-fNqmC_EgCN_DyG3X-dgI2RZFA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/util-2.2.4-r-7-okFosTGJSr5au-fNqmC_EgCN_DyG3X-dgI2RZFA.jar' is inaccessible.
Jan 12, 2022 1:57:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-12T13:57:43.533Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 12, 2022 2:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-12T14:00:43.788Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 12, 2022 2:03:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-12T14:03:41.562Z: Staged package gcsio-2.2.4-p9i4-xCmDBUow6YptIUTZYGtZZhNat4tocR78u1Q8kQ.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/gcsio-2.2.4-p9i4-xCmDBUow6YptIUTZYGtZZhNat4tocR78u1Q8kQ.jar' is inaccessible.
Jan 12, 2022 2:03:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-12T14:03:41.735Z: Staged package google-api-services-iamcredentials-v1-rev20210326-1.32.1-rzLlTs22sMZOFz3yEOfAB70ZZw2Gl99hr42n_k-f8-M.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-iamcredentials-v1-rev20210326-1.32.1-rzLlTs22sMZOFz3yEOfAB70ZZw2Gl99hr42n_k-f8-M.jar' is inaccessible.
Jan 12, 2022 2:03:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-12T14:03:44.117Z: Staged package util-2.2.4-r-7-okFosTGJSr5au-fNqmC_EgCN_DyG3X-dgI2RZFA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/util-2.2.4-r-7-okFosTGJSr5au-fNqmC_EgCN_DyG3X-dgI2RZFA.jar' is inaccessible.
Jan 12, 2022 2:03:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-12T14:03:44.199Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 12, 2022 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T16:00:37.254Z: Cancel request is committed for workflow job: 2022-01-12_04_45_22-2112328189684646921.
Jan 12, 2022 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T16:00:37.292Z: Cleaning up.
Jan 12, 2022 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T16:00:37.361Z: Stopping worker pool...
Jan 12, 2022 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T16:00:37.419Z: Stopping worker pool...
Jan 12, 2022 4:03:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T16:03:08.424Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 12, 2022 4:03:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-12T16:03:08.462Z: Worker pool stopped.
Jan 12, 2022 4:03:14 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-12_04_45_22-2112328189684646921 finished with status CANCELLED.
Load test results for test (ID): d69e0a5f-b8fc-4db2-b8f3-b7f64275daa1 and timestamp: 2022-01-12T12:45:17.603000000Z:
Metric:                                   Value:
dataflow_v2_java11_runtime_sec            11556.566
dataflow_v2_java11_total_bytes_count      2.5748142E10
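As a worked example (illustrative only), the two metrics above imply an average throughput of roughly 2.23 MB/s for the run:

```java
// Derive average throughput from the reported runtime and byte count.
public class ThroughputCheck {
    static double mbPerSec(double totalBytes, double runtimeSec) {
        return totalBytes / runtimeSec / 1e6; // bytes/s converted to MB/s
    }

    public static void main(String[] args) {
        // 2.5748142E10 bytes processed over 11556.566 seconds
        System.out.printf("%.2f MB/s%n", mbPerSec(2.5748142e10, 11556.566));
    }
}
```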
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
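The stack trace above shows why the build fails even though the pipeline was cancelled deliberately (by the Jenkins timeout): the load-test harness treats any terminal state other than DONE as a failure. A simplified sketch of that check, assuming DONE is the only passing terminal state; this is not the actual Beam source, only an illustration of the behavior:

```java
// Sketch of a terminal-state check that raises on non-DONE states,
// which is how a timeout-driven cancel surfaces as a build failure.
public class JobFailureSketch {
    enum State { DONE, CANCELLED, FAILED, UPDATED }

    static void handleTerminalState(State state) {
        if (state != State.DONE) {
            throw new RuntimeException("Invalid job state: " + state + ".");
        }
    }

    static boolean isFailure(State state) {
        try {
            handleTerminalState(state);
            return false;
        } catch (RuntimeException e) {
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println(isFailure(State.CANCELLED));
    }
}
```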

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220112124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:644ea5771678e45504dcc8c323ea4f587c10cd548c21ff535811bd26e6e6f511
Deleted: sha256:0dfc9854b29ecdf055f1a42d1e398367977085dba482d0c8327d549aff737cf5
Deleted: sha256:a7ce9f3e98d3a30793328e3d86904ebe993029b65db25c864dbc67ab3376f2aa
Deleted: sha256:bb793563763b6e29ab48bb341b75591515dbbd697a5c0034333368da152e80d0
Deleted: sha256:53d40fcfaa8360999ac629d6ddc422ddad2fabced1b475953b75d819a6b458f2
Deleted: sha256:3bebf7a0f89af4357ae770d74649c3825cb3f35948685ad3eaa553e8d4f53ba1
Deleted: sha256:3ad6a3a5ded08454a5c93b3829329999ba1d6ed96e2ba1c4972adbb6f142fcae
Deleted: sha256:3364957c99fc3e105cd364238d8cf0de55e1310d933b4fb5a10f80396ad5d21b
Deleted: sha256:5b4fc224828c6ed98d577a185f18175c3fc506a7b82300e8e0fcd434e531ad63
Deleted: sha256:9b9cb659fcf54a259715307f353c0be84706c7789e34adf1b804d6fa5c097314
Deleted: sha256:b49bb393c62913aae01b4bb1ca8430c039bdb4848d175bf97cc73b638a804081
Deleted: sha256:d0c06bdcd092469d21c5586c75d8b31eb3f4246d3586c126d49e5a9c14a8537a
Deleted: sha256:15ff7d38261dbfffac9fcc75bd2754d5a90bf9a465e5756ec3087f24147a8a89
Deleted: sha256:7155b87e21468b2b3cf94b07e4b1786d75d50bfa33878c0edb5fe3cb9b40b16e
Deleted: sha256:29c6c85b48dbe99fcee69697fc91fcf888576fe8d9c445b120af1a3369231101
Deleted: sha256:e79aae3f0bfa8cd09685a25fde7fb97d57b38caf01b3546a9a3fc2a3d7e5747c
Deleted: sha256:3e8890237274d668fea6f34298ea6f4f31959102dfaa9f0cc237db9bfbab8cf8
Deleted: sha256:59fe8ac607ce86a2e423006de90e0a2de4a526e5d98ececadd19f9f42d7409ac
Deleted: sha256:b46fffbf14394c0a3eac04831d609a53ce93da7b0a4510d1a27e879ef4ba0fb6
Deleted: sha256:717cc93369dd7e8dc2e29fd3fc8d65e42b7764bf3cd0c793f3cf080734d24df8
Deleted: sha256:363270ea08d7b1d60c83d3587f8818a07ad30ef25fb4bad2029025e587bf14f0
Deleted: sha256:1a8c546957bd16daecd5308fa3d3433010ae9f208aa1cf606a56163608ec8f6f
Deleted: sha256:b81e48de15e461cde6c538a541bdd2cb9458411a81b0a199cab4fdba0b36272b
Deleted: sha256:3d8b798767443fcc97c23db81bd5244495b322e51cb9c2fcc94921c999e57689
Deleted: sha256:e1850bd3ab12ef03ed4a58607abf2861a8ee248a88f37747bcf1ae976d846bb7
Deleted: sha256:7791a3541d594230ac548fd9296107cbcc9a9d61f60c596358d3628df622e517
Deleted: sha256:eb685e3b53aa6bf21e9a15625559b33994f5079a423bebbf63c037b8a3ee4b13
Deleted: sha256:e7668c49143ff404a6e4a559adc0a61178ca2e044d809dc2e1bd2c5b08d8d076
Deleted: sha256:cda3b5a3ef13623d08b2489ffcff3bcdc090e0c931d8c190abd5aeb09e6cc8c7
Deleted: sha256:9bb76884f8cfe43892bf80649b821cbf4dc50ad6a3f3d9c880c753028b34fcf6
Deleted: sha256:7b86fc8d765768e6ddf9399618660d19a9ff1315be38d2db2a469e95b21f1c8b
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220112124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:644ea5771678e45504dcc8c323ea4f587c10cd548c21ff535811bd26e6e6f511]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220112124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:644ea5771678e45504dcc8c323ea4f587c10cd548c21ff535811bd26e6e6f511])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:644ea5771678e45504dcc8c323ea4f587c10cd548c21ff535811bd26e6e6f511
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:644ea5771678e45504dcc8c323ea4f587c10cd548c21ff535811bd26e6e6f511
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:644ea5771678e45504dcc8c323ea4f587c10cd548c21ff535811bd26e6e6f511].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 58s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/s27airpsigb3w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #208

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/208/display/redirect?page=changes>

Changes:

[relax] don't close streams in finalize

[noreply] Loosen typing extensions bound

[Alexey Romanenko] [BEAM-4868] Bump com.amazonaws to 1.12.135

[noreply] [BEAM-12092] Bump jedis to version 4.0.1 (#16287)

[noreply] [BEAM-13534] Add automated port polling to expansion service runner if

[noreply] Merge pull request #16344 from [BEAM-13536][Playground][Bugfix] CI step

[noreply] Merge pull request #16359 from [BEAM-13545][Playground] Add

[noreply] Merge pull request #16384 from [BEAM-13308] [Playground] Getting

[noreply] Merge pull request #16306 from [BEAM-13447] [Playground] Add filling of

[noreply] Merge pull request #16361 from [BEAM-13543][Playground] Add logic of

[noreply] [BEAM-12562] Dataframe pipe implementation (#16256)

[noreply] Merge pull request #16338 from [BEAM-13528][Playground] Add liveness

[noreply] [BEAM-13626] Remap expanded outputs after merging. (#16471)

[noreply] Merge pull request #16147 from [BEAM-13359] [Playground] Tag existing

[noreply] [BEAM-3221] Improve documentation in model pipeline protos (#16474)

[noreply] [BEAM-13614] Add OnWindowExpiration support to the Java SDK harness and

[noreply] Merge pull request #16156 from [BEAM-13391] Fix temporary file format in

[mmack] [adhoc] Run spotlessApply on java examples to fix master


------------------------------------------
[...truncated 50.31 KB...]
79e74bb66b4c: Pushed
767109ab0085: Pushed
82f2be288639: Pushed
bc316e8f0ec8: Pushed
3bb5258f46d2: Layer already exists
832e177bb500: Layer already exists
f9e18e59a565: Layer already exists
26a504e63be4: Layer already exists
a3899668c851: Pushed
2441839b891a: Pushed
8bf42db0de72: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
0d6b484a5cd0: Pushed
92e46d8242ac: Pushed
20220111124551: digest: sha256:1ddc582ebc1c60bfe9a1b8605dbad5ff0f60246b05a617d1ff911f9c33443eee size: 4520

> Task :sdks:java:testing:load-tests:run
Jan 11, 2022 12:50:13 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 11, 2022 12:50:14 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Jan 11, 2022 12:50:17 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 11, 2022 12:50:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 11, 2022 12:50:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 11, 2022 12:50:31 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 5 seconds
Jan 11, 2022 12:50:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 11, 2022 12:50:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash f70c1dafa2c18a8791e54430fcce566ee19b5d2602ac9f089c3f988c575f1e5f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9wwdr6LBioeR5UQw_M5WbuGbXSYCrJ8InD-YjFdfHl8.pb
Jan 11, 2022 12:50:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 11, 2022 12:50:38 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e0a864d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@440e3ce6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e67f5f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fd53053, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4527f70a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@707b1a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7132a9dc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57435801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da66a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@527fc8e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@61bfc9bf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c7106d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@329bad59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@862f408, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@178f268a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33ecbd6c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c723f2d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@432f521f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d7a9786, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7bab5898]
Jan 11, 2022 12:50:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 11, 2022 12:50:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 11, 2022 12:50:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 11, 2022 12:50:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 11, 2022 12:50:38 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@18f55704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67cefd84, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5fbe155, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6add8e3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58a2b917, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48904d5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12bbfc54, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1491344a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59b65dce, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1386313f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e922647, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@433c6abb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@288f173f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a22c4d8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cd7bc5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6504a875, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35e26d05, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29fa6b65, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c72ecc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47406941]
Jan 11, 2022 12:50:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 11, 2022 12:50:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 11, 2022 12:50:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 11, 2022 12:50:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 11, 2022 12:50:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 11, 2022 12:50:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 11, 2022 12:50:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 11, 2022 12:50:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 11, 2022 12:50:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 11, 2022 12:50:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 11, 2022 12:50:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 11, 2022 12:50:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 11, 2022 12:50:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-11_04_50_39-13865195789965900416?project=apache-beam-testing
Jan 11, 2022 12:50:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-11_04_50_39-13865195789965900416
Jan 11, 2022 12:50:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-11_04_50_39-13865195789965900416
Jan 11, 2022 12:50:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-11T12:50:47.162Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-ssqc. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 11, 2022 12:50:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:50.799Z: Worker configuration: e2-standard-2 in us-central1-f.
Jan 11, 2022 12:50:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:51.592Z: Expanding SplittableParDo operations into optimizable parts.
Jan 11, 2022 12:50:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:51.627Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 11, 2022 12:50:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:51.708Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 11, 2022 12:50:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:51.784Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 11, 2022 12:50:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:51.824Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 11, 2022 12:50:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:51.892Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.011Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.052Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.090Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.129Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.158Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.194Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.233Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.277Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.300Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.348Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.418Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.468Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.516Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.553Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.579Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.611Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.653Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.687Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.716Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.761Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.793Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.817Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:52.859Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 11, 2022 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:50:53.230Z: Starting 5 workers in us-central1-f...
Jan 11, 2022 12:51:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:51:08.681Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 11, 2022 12:51:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:51:37.357Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 11, 2022 12:52:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:52:40.678Z: Workers have started successfully.
Jan 11, 2022 12:52:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T12:52:40.723Z: Workers have started successfully.
Jan 11, 2022 4:00:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T16:00:37.859Z: Cancel request is committed for workflow job: 2022-01-11_04_50_39-13865195789965900416.
Jan 11, 2022 4:00:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T16:00:37.959Z: Cleaning up.
Jan 11, 2022 4:00:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T16:00:38.045Z: Stopping worker pool...
Jan 11, 2022 4:00:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T16:00:38.115Z: Stopping worker pool...
Jan 11, 2022 4:03:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T16:03:00.877Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 11, 2022 4:03:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-11T16:03:00.938Z: Worker pool stopped.
Jan 11, 2022 4:03:09 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-11_04_50_39-13865195789965900416 finished with status CANCELLED.
Load test results for test (ID): 6c7e72b3-10d3-4bdf-b17c-c728559d1dc5 and timestamp: 2022-01-11T12:50:15.651000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11246.394
dataflow_v2_java11_total_bytes_count             1.67860974E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220111124551
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ddc582ebc1c60bfe9a1b8605dbad5ff0f60246b05a617d1ff911f9c33443eee
Deleted: sha256:fa8c0e241abe97db0594353ebe104dde22b84b5c1b1abb582ce661bcf7e820c0
Deleted: sha256:3d30f8eed14cdd09eb1a17d800a6b7adb5b95bdbbf538817a7cf9778c7e32e56
Deleted: sha256:d0ef289bdb7c4c3b91f06964ff3a922260dd21d5e134a1d0148489cf1090c39f
Deleted: sha256:01509efe3a40fa8506cde87ccbc6a428fb9036ca414bbbc0620f1b53e13dcf9b
Deleted: sha256:c51f7c819bf73e620e85c93a562eb3e8a4b466cb7a7a7423b9720eebe2259952
Deleted: sha256:bd0f302f96aecae043163f9dd2b75596f43b2153ba624055789b92e8f818948e
Deleted: sha256:aa827bc37ccfb1e87382b2fc6a44882edb662a8b2859634fb8a919dcaad9f5be
Deleted: sha256:fdf67147c0f4b3f49e3fd7fe205cc384733d7934e87e83148552aba67cd98abc
Deleted: sha256:6a37d7672c308246fafcaba4581e2fda5cf9439e161c3dce60c5a905e0fbef54
Deleted: sha256:3b1a05917abcd7bb5655b3ae611f16f718fb83a2091a90ee6db3e8157e2a4b13
Deleted: sha256:468a1a98f868e4c5b0a38cc87489d434b0ab8dd0622184ceaac641a119ce35f0
Deleted: sha256:b34d75f5fde59b04ef9a91354dffb789dbef1bff8411c3abe23bf81b303f1a2b
Deleted: sha256:d32fc5d59db8365ec7cba18fea4026399c6c8ebeb8be30c44102a94bb855cf27
Deleted: sha256:af93d20e19f0c83d3733beeb8697a9517e87a3ddc779e59865f396304b71b531
Deleted: sha256:419ded4f7a31274087ef908da2628bdd5f8a3bf062577f031384c02f458ff49c
Deleted: sha256:60654b57228d25313b5fabea64fe91733b102b90fa84ec4adc5a6f71d58f08e1
Deleted: sha256:512fa6cd261a4bcd296d6d160d636ceb65984e217f65f146b3a9c9d4b57fe0b7
Deleted: sha256:2e2505dcda6216420c6935ed34ba16f65f9137522bafa6bf9ac46ebbe4026a4d
Deleted: sha256:026364ec2081d163bf76d84e8b7a967fa6ea874c301153ad6081560cef3546dc
Deleted: sha256:b6a9710454434f0f9801d342a4e02557f3201158a7430aa3a78694b4e75605f3
Deleted: sha256:fd7200751f4c714cf14f4fe8d1d5e53d4654f0ae682eb8a8b17b351c0353969c
Deleted: sha256:f4814528739ca31004037e9810860294f9c0e26fefecddc1473f5418965d9897
Deleted: sha256:661ece5de9e7d4608364905efd781773f79f11640c99e7c5f3dff71c6ecce5c2
Deleted: sha256:c12cf600ef5d02af63441cdf565cac7684f4d2644c071b7530c27906110b6cdf
Deleted: sha256:22853f80256fbb2351ce641821b1d6cc8f5c8f9e42e0211cd675399cdf8972ef
Deleted: sha256:3e47d4a2c6d65227a761f6084dbd7c2df28b555780cb4c01129a00b121461653
Deleted: sha256:e73c6408e63e53722487e8292c4971c8aebd1ee5141f5f38211cd9b5246c4127
Deleted: sha256:2a42f8e9eb56c8a1568e0b4254c3bea31bb71fe3db14e424341db7cd434694bc
Deleted: sha256:ead634f1494fe7553a5c0196e52a80beedf1167daba8104cd2ab37de5a9df0cb
Deleted: sha256:68e886c63ba834f77a04b7b58c9c5773e5105accdf5c31d3405e7c1873351819
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220111124551]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ddc582ebc1c60bfe9a1b8605dbad5ff0f60246b05a617d1ff911f9c33443eee]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220111124551] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ddc582ebc1c60bfe9a1b8605dbad5ff0f60246b05a617d1ff911f9c33443eee])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ddc582ebc1c60bfe9a1b8605dbad5ff0f60246b05a617d1ff911f9c33443eee
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ddc582ebc1c60bfe9a1b8605dbad5ff0f60246b05a617d1ff911f9c33443eee
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ddc582ebc1c60bfe9a1b8605dbad5ff0f60246b05a617d1ff911f9c33443eee].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 18m 34s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/njrz3wln5udns

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #207

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/207/display/redirect?page=changes>

Changes:

[Alexey Romanenko] [BEAM-8727] Bump software.amazon.awssdk to 2.17.106


------------------------------------------
[...truncated 50.03 KB...]
dc39cbdf68ec: Pushed
8dff6234b371: Pushed
aaa06dfc4b09: Pushed
781753bd57dc: Pushed
3bb5258f46d2: Layer already exists
97b5df71f94d: Pushed
4065f5dd1ce4: Pushed
832e177bb500: Layer already exists
f9e18e59a565: Layer already exists
26a504e63be4: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
8bf42db0de72: Layer already exists
2433e8a37a43: Pushed
2cfca14c1553: Pushed
20220110125822: digest: sha256:63aa69ba85114b9a7e55d1e8cf4c7c5ef2151b9ae2729504823103f9c2e7130c size: 4520

> Task :sdks:java:testing:load-tests:run
Jan 10, 2022 1:00:07 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 10, 2022 1:00:07 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Jan 10, 2022 1:00:08 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 10, 2022 1:00:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 10, 2022 1:00:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 10, 2022 1:00:11 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Jan 10, 2022 1:00:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 10, 2022 1:00:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 388b486776f0e09d258c614d195174106d80d9980d6b4542c5a2f9b30810eef3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-OItIZ3bw4J0ljGFNGVF0EG2A2ZgNa0VCxaL5swgQ7vM.pb
Jan 10, 2022 1:00:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 10, 2022 1:00:12 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37045b48, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60b34931, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4aa21f9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71c17a57, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@640ab13c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e0a864d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@440e3ce6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e67f5f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fd53053, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4527f70a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@707b1a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7132a9dc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57435801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da66a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@527fc8e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@61bfc9bf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c7106d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@329bad59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@862f408, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@178f268a]
Jan 10, 2022 1:00:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 10, 2022 1:00:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 10, 2022 1:00:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 10, 2022 1:00:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 10, 2022 1:00:13 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@791c12e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b112b13, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24eb65e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ac3f6f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1abebef3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@18f55704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67cefd84, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5fbe155, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6add8e3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58a2b917, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48904d5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12bbfc54, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1491344a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59b65dce, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1386313f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e922647, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@433c6abb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@288f173f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a22c4d8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cd7bc5]
Jan 10, 2022 1:00:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 10, 2022 1:00:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 10, 2022 1:00:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 10, 2022 1:00:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 10, 2022 1:00:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 10, 2022 1:00:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 10, 2022 1:00:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 10, 2022 1:00:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 10, 2022 1:00:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 10, 2022 1:00:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 10, 2022 1:00:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 10, 2022 1:00:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 10, 2022 1:00:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-10_05_00_13-957728361604083376?project=apache-beam-testing
Jan 10, 2022 1:00:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-10_05_00_13-957728361604083376
Jan 10, 2022 1:00:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-10_05_00_13-957728361604083376
Jan 10, 2022 1:00:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-10T13:00:22.542Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-pqck. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 10, 2022 1:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:25.918Z: Worker configuration: e2-standard-2 in us-central1-f.
Jan 10, 2022 1:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:26.521Z: Expanding SplittableParDo operations into optimizable parts.
Jan 10, 2022 1:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:26.551Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 10, 2022 1:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:26.615Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 10, 2022 1:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:26.686Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 10, 2022 1:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:26.713Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 10, 2022 1:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:26.779Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:26.886Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:26.930Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:26.957Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:26.981Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.015Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.039Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.061Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.096Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.117Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.141Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.165Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.200Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.235Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.257Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.279Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.306Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.341Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.371Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.405Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.466Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.502Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.534Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.558Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 10, 2022 1:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:27.878Z: Starting 5 workers in us-central1-f...
Jan 10, 2022 1:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:00:43.129Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 10, 2022 1:01:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:01:13.115Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 10, 2022 1:02:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:02:12.988Z: Workers have started successfully.
Jan 10, 2022 1:02:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T13:02:13.052Z: Workers have started successfully.
Jan 10, 2022 4:00:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T16:00:41.704Z: Cancel request is committed for workflow job: 2022-01-10_05_00_13-957728361604083376.
Jan 10, 2022 4:00:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T16:00:41.762Z: Cleaning up.
Jan 10, 2022 4:00:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T16:00:41.833Z: Stopping worker pool...
Jan 10, 2022 4:00:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T16:00:41.898Z: Stopping worker pool...
Jan 10, 2022 4:03:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T16:03:09.599Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 10, 2022 4:03:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-10T16:03:09.629Z: Worker pool stopped.
Jan 10, 2022 4:03:16 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-10_05_00_13-957728361604083376 finished with status CANCELLED.
Load test results for test (ID): e507aab1-8236-48ec-a3fe-a38e8f856745 and timestamp: 2022-01-10T13:00:07.869000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 10672.675
dataflow_v2_java11_total_bytes_count             1.42755092E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
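The stack trace above reflects the load-test harness treating any terminal job state other than DONE as a failure, so a streaming job that the scheduler cancels after its time budget still fails the build. A minimal sketch of that kind of check follows; `JobState` and `JobFailureCheck` are illustrative names for this sketch, not Beam's actual API surface.

```java
// Hypothetical sketch of the terminal-state check behind "Invalid job state: CANCELLED."
// JobState and JobFailureCheck are illustrative names, not Beam's real classes.
enum JobState { DONE, FAILED, CANCELLED, UPDATED }

class JobFailureCheck {
    // Any terminal state other than DONE is reported as a failed load test,
    // mirroring the RuntimeException thrown in the log above.
    static void handleFailure(JobState terminalState) {
        if (terminalState != JobState.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }
}
```

Under this model, a deliberately cancelled streaming load test needs its expected terminal state configured explicitly, or every timed run ends as a build failure like the one here.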

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220110125822
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:63aa69ba85114b9a7e55d1e8cf4c7c5ef2151b9ae2729504823103f9c2e7130c
Deleted: sha256:fa0e1a4ce5b525b74b8cf0aafcdbdb045329d0d7370e6696ba4ad26cef3489ef
Deleted: sha256:85acfc7854af4e97f17d405b70ac5e7a7e17e7a09178e588316b1dedc5d1706b
Deleted: sha256:ace68236c694a0a5433bc06318009de33e862b7e03bf29532edcc1455bbde412
Deleted: sha256:01b34adcd0d0293859c5f33ec501837743a1ce87b42adf3caeea91d0bd3ec26c
Deleted: sha256:1139ea4cfbd758b08ce84d6a90e1bc54e6427d4df91098338ccfd5eb1676c367
Deleted: sha256:4d427d4c444fc118741eb98a1c81bb9e4de9147c3fd70486fa8982e7544eff16
Deleted: sha256:05d6c5c335f013051c496942b3f8572d24551b8623e1ee2c4b157d31b5242070
Deleted: sha256:87f143ec89ef0e6684282a8f9ab743241ec1c2701f1a5818ffc1779aa5ccb292
Deleted: sha256:556cee2737c37c3fb362525348494d84870cef7461ff38d976daf7f3d90df16b
Deleted: sha256:bccc893ab7dfd97aec6a49d1c4758a8997bcb4d7a596a8a7c685edb7cb5910b5
Deleted: sha256:a7ef68fa8820af19b21ba3c986f7ceb305393a408b758f3671b78ef72fb489a2
Deleted: sha256:957e6d572f767bf39858b557ac23276264cda27459d2366ff405e958e0e88284
Deleted: sha256:c9fdcfd0e1f3465fd0192b48f8c4a4ff539559892c039c389baa057b4d9114ac
Deleted: sha256:1c98ab856d749af49de5c8d8d518225dc250206ccde0f728535a51669d847bf2
Deleted: sha256:d9e66597176a1120109786215b146cd6c23a285b9246624ee9203892f969e2b0
Deleted: sha256:c20972ec3c14861f6769c488569233d6b66e48e96fb5434faf9e979f5d16018a
Deleted: sha256:d3dd2cda4fbba257a2d384735e2460a6396505e0c2f6329b5eb9232f1334d571
Deleted: sha256:9db193f2f1eac5bbefc67715dc10547b8dffe5221068d0ecb777586d0d1fc1e4
Deleted: sha256:751b35ff270e0728b29808c54150b830ee37b9404c3af033f8a96e276444c5d1
Deleted: sha256:4a316c6804be6bcf4a5bf1f46649a67090f421a7ffe6561059d7a04214de30f2
Deleted: sha256:99a6f2c086cc67f50b6a307b6c2781176404ecfb58b2d74bd18a4480adc5999f
Deleted: sha256:cc3203d7ab3cf78fc7efce3dfa05c125ea70c36282b04597e84db367df456db3
Deleted: sha256:b946168f2939b86c9a4c9fb5617ba79b95d9cb507560784c8703b49953a45dfb
Deleted: sha256:666dd19db6bbc81187c4ec4fd7f4bd13649f956b219dd9923603c3df94de6be1
Deleted: sha256:884d8f324381440e42556e0478a0c68ee7aee10e49a0f5d03d37d22c71cf8cef
Deleted: sha256:b7e9f8b8820a373a4a4f792b235f0af3b95588e85aab2913e311e9467a63a8a5
Deleted: sha256:7b3c51279f492fc862c1ae6e48abe722061d3c350c81e5919cbd227389bfb22a
Deleted: sha256:15cb9b0e04f34ad82d94a969f66dd3644b559b3c5c15d7d30cf35442c9d47b86
Deleted: sha256:c36325729922a2f2eb07663a16ffdd4cef1b753ace054985e421904986b8d802
Deleted: sha256:67e1725730cbb360174ffe8d552143a62cfec96cdd07dd27857cbf025fd58fcc
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220110125822]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:63aa69ba85114b9a7e55d1e8cf4c7c5ef2151b9ae2729504823103f9c2e7130c]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220110125822] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:63aa69ba85114b9a7e55d1e8cf4c7c5ef2151b9ae2729504823103f9c2e7130c])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:63aa69ba85114b9a7e55d1e8cf4c7c5ef2151b9ae2729504823103f9c2e7130c
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:63aa69ba85114b9a7e55d1e8cf4c7c5ef2151b9ae2729504823103f9c2e7130c
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:63aa69ba85114b9a7e55d1e8cf4c7c5ef2151b9ae2729504823103f9c2e7130c].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 5m 12s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/r75wcrrleeucw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #206

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/206/display/redirect>

Changes:


------------------------------------------
[...truncated 49.62 KB...]
59d5b5a212d5: Preparing
0923754a550c: Preparing
2722fe1fa18d: Preparing
36d4b61891b3: Preparing
3bb5258f46d2: Preparing
832e177bb500: Preparing
f9e18e59a565: Preparing
26a504e63be4: Preparing
8bf42db0de72: Preparing
31892cc314cb: Preparing
11936051f93b: Preparing
59d5b5a212d5: Waiting
26a504e63be4: Waiting
31892cc314cb: Waiting
f9e18e59a565: Waiting
3bb5258f46d2: Waiting
11936051f93b: Waiting
140b8d74f702: Waiting
832e177bb500: Waiting
0923754a550c: Waiting
7fd4d9613c12: Waiting
36d4b61891b3: Waiting
2722fe1fa18d: Waiting
e935230b7b37: Waiting
81ec05982540: Waiting
a63813f52acb: Pushed
bd8bb9523d67: Pushed
47e8e840661e: Pushed
140b8d74f702: Pushed
81ec05982540: Pushed
7b677137a2dd: Pushed
721d4902504f: Pushed
59d5b5a212d5: Pushed
2722fe1fa18d: Pushed
3bb5258f46d2: Layer already exists
7fd4d9613c12: Pushed
832e177bb500: Layer already exists
f9e18e59a565: Layer already exists
26a504e63be4: Layer already exists
31892cc314cb: Layer already exists
8bf42db0de72: Layer already exists
e935230b7b37: Pushed
11936051f93b: Layer already exists
36d4b61891b3: Pushed
0923754a550c: Pushed
20220109124334: digest: sha256:21470cf6ac2a11570c9e965019fca4a3334847485e2c986a2b77a13b8f1ec5ce size: 4520

> Task :sdks:java:testing:load-tests:run
Jan 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 09, 2022 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Jan 09, 2022 12:45:29 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 09, 2022 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 09, 2022 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 09, 2022 12:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Jan 09, 2022 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 09, 2022 12:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 9fdaaec445b9463556b4ee46e1c99c3406a4c287e2f35a5cd83183e9e4c0f9b8> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-n9quxEW5RjVWtO5G4cmcNAakwofi81pc2DGD6eTA-bg.pb
Jan 09, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 09, 2022 12:45:34 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b039c6d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f5b9db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@507d64aa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37045b48, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60b34931, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4aa21f9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71c17a57, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@640ab13c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e0a864d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@440e3ce6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e67f5f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fd53053, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4527f70a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@707b1a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7132a9dc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57435801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da66a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@527fc8e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@61bfc9bf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c7106d9]
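The "Split into 20 bundles" line records the synthetic source dividing its record space across a desired number of initial splits. A rough sketch of that kind of even partitioning is below; it is illustrative only, and `BundleSplitter` is a hypothetical helper, not the real `SyntheticUnboundedSource.split` implementation.

```java
// Illustrative even partitioning of totalRecords across desiredBundles splits.
// BundleSplitter is a hypothetical name; Beam's actual splitting logic differs.
class BundleSplitter {
    static long[] bundleSizes(long totalRecords, int desiredBundles) {
        long[] sizes = new long[desiredBundles];
        long base = totalRecords / desiredBundles;
        long remainder = totalRecords % desiredBundles;
        for (int i = 0; i < desiredBundles; i++) {
            // Spread the remainder over the first bundles so sizes differ by at most 1.
            sizes[i] = base + (i < remainder ? 1 : 0);
        }
        return sizes;
    }
}
```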
Jan 09, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 09, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 09, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 09, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 09, 2022 12:45:34 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a37191a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5854a18, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d5556bf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@791c12e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b112b13, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24eb65e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ac3f6f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1abebef3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@18f55704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67cefd84, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5fbe155, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6add8e3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58a2b917, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48904d5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12bbfc54, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1491344a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59b65dce, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1386313f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e922647, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@433c6abb]
Jan 09, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 09, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 09, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 09, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 09, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 09, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 09, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 09, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 09, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 09, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 09, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 09, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 09, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-09_04_45_34-13620770890621873228?project=apache-beam-testing
Jan 09, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-09_04_45_34-13620770890621873228
Jan 09, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-09_04_45_34-13620770890621873228
Jan 09, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-09T12:45:41.886Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-z7xc. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 09, 2022 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:44.864Z: Worker configuration: e2-standard-2 in us-central1-f.
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:45.526Z: Expanding SplittableParDo operations into optimizable parts.
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:45.553Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:45.622Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:45.698Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:45.726Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:45.792Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:45.901Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:45.925Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:45.959Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.004Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.036Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.068Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.103Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.136Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.169Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.203Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.236Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.272Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.304Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.336Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.367Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.401Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.428Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.454Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.501Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.532Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.557Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.590Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 09, 2022 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.627Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 09, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:45:46.988Z: Starting 5 workers in us-central1-f...
Jan 09, 2022 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:46:18.284Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 09, 2022 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:46:36.994Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 09, 2022 12:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:47:30.914Z: Workers have started successfully.
Jan 09, 2022 12:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T12:47:30.947Z: Workers have started successfully.
Jan 09, 2022 4:00:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T16:00:37.453Z: Cancel request is committed for workflow job: 2022-01-09_04_45_34-13620770890621873228.
Jan 09, 2022 4:00:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T16:00:37.528Z: Cleaning up.
Jan 09, 2022 4:00:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T16:00:37.599Z: Stopping worker pool...
Jan 09, 2022 4:00:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T16:00:37.651Z: Stopping worker pool...
Jan 09, 2022 4:02:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T16:02:59.197Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 09, 2022 4:02:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-09T16:02:59.236Z: Worker pool stopped.
Jan 09, 2022 4:03:04 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-09_04_45_34-13620770890621873228 finished with status CANCELLED.
Load test results for test (ID): 268b1372-c261-4e4c-9154-7396d7688cd4 and timestamp: 2022-01-09T12:45:29.116000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11553.163
dataflow_v2_java11_total_bytes_count             1.63210408E10
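The two metrics above can be cross-checked against each other; below is a quick sanity calculation of the average throughput they imply (values copied from this run; the class name is hypothetical, not part of the load-test harness):

```java
public class ThroughputCheck {
    public static void main(String[] args) {
        // Values reported by the load test above.
        double runtimeSec = 11553.163;       // dataflow_v2_java11_runtime_sec
        double totalBytes = 1.63210408E10;   // dataflow_v2_java11_total_bytes_count

        // Average bytes processed per second over the whole run.
        double mbPerSec = totalBytes / runtimeSec / 1e6;
        System.out.printf("avg throughput: %.2f MB/s%n", mbPerSec); // prints "avg throughput: 1.41 MB/s"
    }
}
```

So this streaming CoGBK run sustained roughly 1.4 MB/s on average across its ~3h12m of runtime before cancellation.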
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
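The RuntimeException above is how the load-test harness fails the build when the Dataflow job ends in any terminal state other than DONE; here the streaming job was cancelled on schedule, and the harness still treats CANCELLED as a failure. A minimal self-contained sketch of that pattern (the enum and class are simplified stand-ins, not the actual org.apache.beam.sdk.loadtests.JobFailure source):

```java
public class JobStateCheck {
    public enum State { DONE, CANCELLED, FAILED, UPDATED }

    // Mirror of the idea behind JobFailure.handleFailure: any terminal
    // state other than a clean DONE aborts the test run with an exception.
    public static void handleFailure(State terminalState) {
        if (terminalState != State.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // prints "Invalid job state: CANCELLED."
        }
    }
}
```

That exception propagates out of main, which is what makes the :sdks:java:testing:load-tests:run Gradle task below exit non-zero.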

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220109124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:21470cf6ac2a11570c9e965019fca4a3334847485e2c986a2b77a13b8f1ec5ce
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220109124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:21470cf6ac2a11570c9e965019fca4a3334847485e2c986a2b77a13b8f1ec5ce]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220109124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:21470cf6ac2a11570c9e965019fca4a3334847485e2c986a2b77a13b8f1ec5ce])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:21470cf6ac2a11570c9e965019fca4a3334847485e2c986a2b77a13b8f1ec5ce
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:21470cf6ac2a11570c9e965019fca4a3334847485e2c986a2b77a13b8f1ec5ce
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:21470cf6ac2a11570c9e965019fca4a3334847485e2c986a2b77a13b8f1ec5ce].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 47s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/h4u7ylchxk52w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #205

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/205/display/redirect?page=changes>

Changes:

[Alexey Romanenko] [adhoc] Fix BigTableIO description

[noreply] [BEAM-13015] Remove dead code now that all instances have migrated to

[noreply] [BEAM-13386] Add RLock support for cloudpickle (#16250)

[danthev] Fix overflow


------------------------------------------
[...truncated 49.30 KB...]
5f73317e9c84: Preparing
5dced2c18203: Preparing
c6511c243ebf: Preparing
3bb5258f46d2: Preparing
832e177bb500: Preparing
f9e18e59a565: Preparing
26a504e63be4: Preparing
8bf42db0de72: Preparing
31892cc314cb: Preparing
11936051f93b: Preparing
feb4bb1c7f43: Waiting
5f73317e9c84: Waiting
40a438875f1a: Waiting
5dced2c18203: Waiting
5c89e53604b4: Waiting
26a504e63be4: Waiting
c6511c243ebf: Waiting
8bf42db0de72: Waiting
4e6ede92f183: Waiting
31892cc314cb: Waiting
f9e18e59a565: Waiting
3bb5258f46d2: Waiting
c5c90b910af3: Waiting
832e177bb500: Waiting
11936051f93b: Waiting
da3429be8715: Pushed
2fdee7c17392: Pushed
a66adcd28cba: Pushed
acdd8856e218: Pushed
feb4bb1c7f43: Pushed
77e5c10da981: Pushed
4e6ede92f183: Pushed
5c89e53604b4: Pushed
c5c90b910af3: Pushed
40a438875f1a: Pushed
832e177bb500: Layer already exists
3bb5258f46d2: Layer already exists
5dced2c18203: Pushed
f9e18e59a565: Layer already exists
26a504e63be4: Layer already exists
8bf42db0de72: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
c6511c243ebf: Pushed
5f73317e9c84: Pushed
20220108124333: digest: sha256:778efe755b70f21de23933fbc0772b54ab528f99893fab28dc90a2a79f0b0f49 size: 4520

> Task :sdks:java:testing:load-tests:run
Jan 08, 2022 12:45:18 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 08, 2022 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Jan 08, 2022 12:45:19 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 08, 2022 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 08, 2022 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 08, 2022 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Jan 08, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 08, 2022 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 61d14fb95e9bb2607751fb988bd43c314371eb02718da255f9e20dff8224af63> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-YdFPuV6bsmB3UfuYi9Q8MUNx6wJxjaJV-eIN_4Ikr2M.pb
Jan 08, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 08, 2022 12:45:24 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f5b9db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@507d64aa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37045b48, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60b34931, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4aa21f9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71c17a57, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@640ab13c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e0a864d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@440e3ce6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e67f5f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fd53053, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4527f70a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@707b1a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7132a9dc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57435801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da66a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@527fc8e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@61bfc9bf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c7106d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@329bad59]
Jan 08, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 08, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 08, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 08, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 08, 2022 12:45:24 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5854a18, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d5556bf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@791c12e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b112b13, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24eb65e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ac3f6f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1abebef3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@18f55704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67cefd84, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5fbe155, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6add8e3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58a2b917, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48904d5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12bbfc54, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1491344a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59b65dce, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1386313f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e922647, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@433c6abb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@288f173f]
Jan 08, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 08, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 08, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 08, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 08, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 08, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 08, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 08, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 08, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 08, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 08, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 08, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 08, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-08_04_45_24-5049803997360451121?project=apache-beam-testing
Jan 08, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-08_04_45_24-5049803997360451121
Jan 08, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-08_04_45_24-5049803997360451121
Jan 08, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-08T12:45:33.683Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-hzso. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
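The warning above is emitted because GCE labels only allow lowercase letters, digits, and hyphens (at most 63 characters), so Dataflow rewrites the job name before labeling resources. A rough sketch of that kind of sanitization (the substitution character, length cap, and reconstructed input name are assumptions inferred from the modified name shown in the log, not Dataflow's actual code):

```java
import java.util.Locale;

public class LabelSanitizer {
    // Assumed label rules: lowercase a-z, digits, and '-', at most 63 chars.
    // Disallowed characters are replaced with '0'.
    public static String toCloudLabel(String jobName) {
        String s = jobName.toLowerCase(Locale.ROOT).replaceAll("[^a-z0-9-]", "0");
        return s.length() <= 63 ? s : s.substring(0, 63);
    }

    public static void main(String[] args) {
        // Underscores in the (reconstructed) job name become '0', matching the
        // "load0tests0java110dataflow0v20streaming0cogbk01" prefix in the warning above.
        System.out.println(toCloudLabel("load_tests_Java11_Dataflow_V2_streaming_CoGBK_1"));
        // prints load0tests0java110dataflow0v20streaming0cogbk01
    }
}
```

Naming the Jenkins job with only lowercase letters, digits, and hyphens would avoid this rewriting and keep monitoring labels readable.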
Jan 08, 2022 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:39.437Z: Worker configuration: e2-standard-2 in us-central1-f.
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.058Z: Expanding SplittableParDo operations into optimizable parts.
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.112Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.200Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.256Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.299Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.355Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.452Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.478Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.510Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.542Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.568Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.597Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.620Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.651Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.688Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.729Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.777Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.811Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.835Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.865Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.897Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.919Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.947Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:40.981Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:41.015Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:41.048Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:41.077Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:41.121Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:41.155Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 08, 2022 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:41.532Z: Starting 5 workers in us-central1-f...
Jan 08, 2022 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:45:49.267Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 08, 2022 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:46:31.705Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 08, 2022 12:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:47:26.836Z: Workers have started successfully.
Jan 08, 2022 12:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T12:47:26.861Z: Workers have started successfully.
Jan 08, 2022 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T16:00:27.604Z: Cancel request is committed for workflow job: 2022-01-08_04_45_24-5049803997360451121.
Jan 08, 2022 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T16:00:28.485Z: Cleaning up.
Jan 08, 2022 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T16:00:28.554Z: Stopping worker pool...
Jan 08, 2022 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T16:00:28.615Z: Stopping worker pool...
Jan 08, 2022 4:02:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T16:02:53.930Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 08, 2022 4:02:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-08T16:02:53.973Z: Worker pool stopped.
Jan 08, 2022 4:02:59 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-08_04_45_24-5049803997360451121 finished with status CANCELLED.
Load test results for test (ID): 485e827d-dfe8-4cf5-8114-013bd14af19c and timestamp: 2022-01-08T12:45:19.244000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11544.857
dataflow_v2_java11_total_bytes_count             1.90950472E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220108124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:778efe755b70f21de23933fbc0772b54ab528f99893fab28dc90a2a79f0b0f49
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220108124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:778efe755b70f21de23933fbc0772b54ab528f99893fab28dc90a2a79f0b0f49]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220108124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:778efe755b70f21de23933fbc0772b54ab528f99893fab28dc90a2a79f0b0f49])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:778efe755b70f21de23933fbc0772b54ab528f99893fab28dc90a2a79f0b0f49
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:778efe755b70f21de23933fbc0772b54ab528f99893fab28dc90a2a79f0b0f49
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:778efe755b70f21de23933fbc0772b54ab528f99893fab28dc90a2a79f0b0f49].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 44s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ziymue32vbx2i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #204

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/204/display/redirect?page=changes>

Changes:

[heejong] [BEAM-13091] Generate missing staged names from hash for Dataflow runner

[heejong] add test

[arietis27] [BEAM-13604] NPE while getting null from BigDecimal column

[noreply] Fixed empty labels treated as wildcard when matching cache files

[noreply] [BEAM-13570] Remove erroneous compileClasspath dependency. (#16438)

[noreply] [BEAM-13015] Plumb through process wide and bundle cache through the

[noreply] [BEAM-13015] Cache the state backed iterable used for large GBK results.

[noreply] Fix formatting/alignment (#16443)

[noreply] Merge pull request #16183 from [BEAM-13427] [Playground]  show logs for

[noreply] [BEAM-10277] re-write encoding position tests to declare schema protos

[noreply] Update local_env_tests.yml (#16444)

[noreply] [BEAM-13574] Filesystem abstraction Rename support (#16428)

[noreply] [BEAM-13597] Setup Go in github actions (#16446)

[noreply] Merge pull request #16161 from [BEAM-12164] Add Spanner Partition

[noreply] Merge pull request #16203 from [BEAM-12164] Add Spanner Change Stream


------------------------------------------
[...truncated 72.66 KB...]
SEVERE: 2022-01-07T15:31:25.287Z: Staged package avro-1.8.2-91SggwzmelqfpnpU7BXRA-8V4chQ17Jvr3tkfu3cgtM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/avro-1.8.2-91SggwzmelqfpnpU7BXRA-8V4chQ17Jvr3tkfu3cgtM.jar' is inaccessible.
Jan 07, 2022 3:31:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:31:25.952Z: Staged package beam-vendor-bytebuddy-1_11_0-0.1-SneHjzQgCpxxLkvyN8dugqO_sTzwXJ2995kqFu4l2i4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-bytebuddy-1_11_0-0.1-SneHjzQgCpxxLkvyN8dugqO_sTzwXJ2995kqFu4l2i4.jar' is inaccessible.
Jan 07, 2022 3:31:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:31:26.008Z: Staged package beam-vendor-guava-26_0-jre-0.1-8quizfXTrmcvJyOXi3zWDFtP5EofgqjksY5sIbsBqws.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-guava-26_0-jre-0.1-8quizfXTrmcvJyOXi3zWDFtP5EofgqjksY5sIbsBqws.jar' is inaccessible.
Jan 07, 2022 3:31:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:31:26.115Z: Staged package checker-compat-qual-2.5.5-EdE0skXpysxHRRTS1mtbhhj4A5oUZc3FW7wLNOAAi3o.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/checker-compat-qual-2.5.5-EdE0skXpysxHRRTS1mtbhhj4A5oUZc3FW7wLNOAAi3o.jar' is inaccessible.
Jan 07, 2022 3:31:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:31:26.289Z: Staged package commons-lang3-3.9-3i4dzc8--ReozoWGYaBnJqmpRPKOM61_ngi-pE3DwjA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-lang3-3.9-3i4dzc8--ReozoWGYaBnJqmpRPKOM61_ngi-pE3DwjA.jar' is inaccessible.
Jan 07, 2022 3:31:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:31:26.329Z: Staged package commons-logging-1.2-2t3qHqC-D1aXirMAa4rJKDSv7vvZt-TmMW_KV98PpjY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-logging-1.2-2t3qHqC-D1aXirMAa4rJKDSv7vvZt-TmMW_KV98PpjY.jar' is inaccessible.
Jan 07, 2022 3:31:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:31:26.372Z: Staged package commons-math3-3.6.1-HlbXsFjSi2Wr0la4RY44hbZ0wdWI-kPNfRy7nH7yswg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-math3-3.6.1-HlbXsFjSi2Wr0la4RY44hbZ0wdWI-kPNfRy7nH7yswg.jar' is inaccessible.
Jan 07, 2022 3:31:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:31:27.553Z: Staged package hamcrest-2.1-upOy46ViMiukMvChtTrdzFXLGIJTMZoCDtd_gk5pIFA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/hamcrest-2.1-upOy46ViMiukMvChtTrdzFXLGIJTMZoCDtd_gk5pIFA.jar' is inaccessible.
Jan 07, 2022 3:31:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:31:27.589Z: Staged package hamcrest-core-2.1-4JEJ5UoonYhQa5v-yYfd0Zn0IXyUZBMmaDUbmk8Avuk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/hamcrest-core-2.1-4JEJ5UoonYhQa5v-yYfd0Zn0IXyUZBMmaDUbmk8Avuk.jar' is inaccessible.
Jan 07, 2022 3:31:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:31:27.718Z: Staged package j2objc-annotations-1.3-Ia8wySJnvWEiwOC00gzMtmQaN-r5VsZUDsRx1YTmSns.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/j2objc-annotations-1.3-Ia8wySJnvWEiwOC00gzMtmQaN-r5VsZUDsRx1YTmSns.jar' is inaccessible.
Jan 07, 2022 3:31:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:31:27.817Z: Staged package jackson-core-asl-1.9.13-RAqctcqVshX5U9OiCmsaENofCbUpqd3qX4pJBd2rT1o.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-core-asl-1.9.13-RAqctcqVshX5U9OiCmsaENofCbUpqd3qX4pJBd2rT1o.jar' is inaccessible.
Jan 07, 2022 3:31:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:31:27.960Z: Staged package jackson-mapper-asl-1.9.13-dOegenby7breKTEqWi68z6AZEovAIezjhW12GX6b4MI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-mapper-asl-1.9.13-dOegenby7breKTEqWi68z6AZEovAIezjhW12GX6b4MI.jar' is inaccessible.
Jan 07, 2022 3:31:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:31:28.032Z: Staged package javax.annotation-api-1.3.2-4EulGVvNVV3JVlD3zGFNFR5LzVLSmhC4qiGX86uJq5s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/javax.annotation-api-1.3.2-4EulGVvNVV3JVlD3zGFNFR5LzVLSmhC4qiGX86uJq5s.jar' is inaccessible.
Jan 07, 2022 3:31:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:31:28.103Z: Staged package json-20200518-D_MObagvFa-vDna9bplqerVWaFEWaoP5c8ZKmg-UbS0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/json-20200518-D_MObagvFa-vDna9bplqerVWaFEWaoP5c8ZKmg-UbS0.jar' is inaccessible.
Jan 07, 2022 3:31:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:31:28.146Z: Staged package jsr305-3.0.2-dmrSoHg_JoeWLIrXTO7MOKKLn3Ki0IXuQ4t4E-ko0Mc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jsr305-3.0.2-dmrSoHg_JoeWLIrXTO7MOKKLn3Ki0IXuQ4t4E-ko0Mc.jar' is inaccessible.
Jan 07, 2022 3:31:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-07T15:31:29.115Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 07, 2022 3:34:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-07T15:34:28.791Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 07, 2022 3:37:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:37:25.292Z: Staged package annotations-4.1.1.4-unNOHoTAnWFa9qCdMwNLTwRC-Hct7BIO-zdthqVlrhU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/annotations-4.1.1.4-unNOHoTAnWFa9qCdMwNLTwRC-Hct7BIO-zdthqVlrhU.jar' is inaccessible.
Jan 07, 2022 3:37:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:37:25.457Z: Staged package avro-1.8.2-91SggwzmelqfpnpU7BXRA-8V4chQ17Jvr3tkfu3cgtM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/avro-1.8.2-91SggwzmelqfpnpU7BXRA-8V4chQ17Jvr3tkfu3cgtM.jar' is inaccessible.
Jan 07, 2022 3:37:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:37:26.419Z: Staged package beam-vendor-bytebuddy-1_11_0-0.1-SneHjzQgCpxxLkvyN8dugqO_sTzwXJ2995kqFu4l2i4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-bytebuddy-1_11_0-0.1-SneHjzQgCpxxLkvyN8dugqO_sTzwXJ2995kqFu4l2i4.jar' is inaccessible.
Jan 07, 2022 3:37:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:37:26.481Z: Staged package beam-vendor-guava-26_0-jre-0.1-8quizfXTrmcvJyOXi3zWDFtP5EofgqjksY5sIbsBqws.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-guava-26_0-jre-0.1-8quizfXTrmcvJyOXi3zWDFtP5EofgqjksY5sIbsBqws.jar' is inaccessible.
Jan 07, 2022 3:37:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:37:26.570Z: Staged package checker-compat-qual-2.5.5-EdE0skXpysxHRRTS1mtbhhj4A5oUZc3FW7wLNOAAi3o.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/checker-compat-qual-2.5.5-EdE0skXpysxHRRTS1mtbhhj4A5oUZc3FW7wLNOAAi3o.jar' is inaccessible.
Jan 07, 2022 3:37:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:37:27.156Z: Staged package commons-lang3-3.9-3i4dzc8--ReozoWGYaBnJqmpRPKOM61_ngi-pE3DwjA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-lang3-3.9-3i4dzc8--ReozoWGYaBnJqmpRPKOM61_ngi-pE3DwjA.jar' is inaccessible.
Jan 07, 2022 3:37:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:37:27.201Z: Staged package commons-logging-1.2-2t3qHqC-D1aXirMAa4rJKDSv7vvZt-TmMW_KV98PpjY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-logging-1.2-2t3qHqC-D1aXirMAa4rJKDSv7vvZt-TmMW_KV98PpjY.jar' is inaccessible.
Jan 07, 2022 3:37:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:37:27.268Z: Staged package commons-math3-3.6.1-HlbXsFjSi2Wr0la4RY44hbZ0wdWI-kPNfRy7nH7yswg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-math3-3.6.1-HlbXsFjSi2Wr0la4RY44hbZ0wdWI-kPNfRy7nH7yswg.jar' is inaccessible.
Jan 07, 2022 3:37:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:37:28.755Z: Staged package hamcrest-2.1-upOy46ViMiukMvChtTrdzFXLGIJTMZoCDtd_gk5pIFA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/hamcrest-2.1-upOy46ViMiukMvChtTrdzFXLGIJTMZoCDtd_gk5pIFA.jar' is inaccessible.
Jan 07, 2022 3:37:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:37:28.820Z: Staged package hamcrest-core-2.1-4JEJ5UoonYhQa5v-yYfd0Zn0IXyUZBMmaDUbmk8Avuk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/hamcrest-core-2.1-4JEJ5UoonYhQa5v-yYfd0Zn0IXyUZBMmaDUbmk8Avuk.jar' is inaccessible.
Jan 07, 2022 3:37:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:37:28.916Z: Staged package j2objc-annotations-1.3-Ia8wySJnvWEiwOC00gzMtmQaN-r5VsZUDsRx1YTmSns.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/j2objc-annotations-1.3-Ia8wySJnvWEiwOC00gzMtmQaN-r5VsZUDsRx1YTmSns.jar' is inaccessible.
Jan 07, 2022 3:37:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:37:29.016Z: Staged package jackson-core-asl-1.9.13-RAqctcqVshX5U9OiCmsaENofCbUpqd3qX4pJBd2rT1o.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-core-asl-1.9.13-RAqctcqVshX5U9OiCmsaENofCbUpqd3qX4pJBd2rT1o.jar' is inaccessible.
Jan 07, 2022 3:37:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:37:29.360Z: Staged package jackson-mapper-asl-1.9.13-dOegenby7breKTEqWi68z6AZEovAIezjhW12GX6b4MI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-mapper-asl-1.9.13-dOegenby7breKTEqWi68z6AZEovAIezjhW12GX6b4MI.jar' is inaccessible.
Jan 07, 2022 3:37:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:37:29.609Z: Staged package javax.annotation-api-1.3.2-4EulGVvNVV3JVlD3zGFNFR5LzVLSmhC4qiGX86uJq5s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/javax.annotation-api-1.3.2-4EulGVvNVV3JVlD3zGFNFR5LzVLSmhC4qiGX86uJq5s.jar' is inaccessible.
Jan 07, 2022 3:37:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:37:29.854Z: Staged package json-20200518-D_MObagvFa-vDna9bplqerVWaFEWaoP5c8ZKmg-UbS0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/json-20200518-D_MObagvFa-vDna9bplqerVWaFEWaoP5c8ZKmg-UbS0.jar' is inaccessible.
Jan 07, 2022 3:37:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:37:29.978Z: Staged package jsr305-3.0.2-dmrSoHg_JoeWLIrXTO7MOKKLn3Ki0IXuQ4t4E-ko0Mc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jsr305-3.0.2-dmrSoHg_JoeWLIrXTO7MOKKLn3Ki0IXuQ4t4E-ko0Mc.jar' is inaccessible.
Jan 07, 2022 3:37:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-07T15:37:32.581Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 07, 2022 3:40:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-07T15:40:29.293Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 07, 2022 3:43:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:43:25.127Z: Staged package annotations-4.1.1.4-unNOHoTAnWFa9qCdMwNLTwRC-Hct7BIO-zdthqVlrhU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/annotations-4.1.1.4-unNOHoTAnWFa9qCdMwNLTwRC-Hct7BIO-zdthqVlrhU.jar' is inaccessible.
Jan 07, 2022 3:43:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:43:25.238Z: Staged package avro-1.8.2-91SggwzmelqfpnpU7BXRA-8V4chQ17Jvr3tkfu3cgtM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/avro-1.8.2-91SggwzmelqfpnpU7BXRA-8V4chQ17Jvr3tkfu3cgtM.jar' is inaccessible.
Jan 07, 2022 3:43:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:43:25.711Z: Staged package beam-vendor-bytebuddy-1_11_0-0.1-SneHjzQgCpxxLkvyN8dugqO_sTzwXJ2995kqFu4l2i4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-bytebuddy-1_11_0-0.1-SneHjzQgCpxxLkvyN8dugqO_sTzwXJ2995kqFu4l2i4.jar' is inaccessible.
Jan 07, 2022 3:43:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:43:25.760Z: Staged package beam-vendor-guava-26_0-jre-0.1-8quizfXTrmcvJyOXi3zWDFtP5EofgqjksY5sIbsBqws.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-guava-26_0-jre-0.1-8quizfXTrmcvJyOXi3zWDFtP5EofgqjksY5sIbsBqws.jar' is inaccessible.
Jan 07, 2022 3:43:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:43:25.817Z: Staged package checker-compat-qual-2.5.5-EdE0skXpysxHRRTS1mtbhhj4A5oUZc3FW7wLNOAAi3o.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/checker-compat-qual-2.5.5-EdE0skXpysxHRRTS1mtbhhj4A5oUZc3FW7wLNOAAi3o.jar' is inaccessible.
Jan 07, 2022 3:43:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:43:25.960Z: Staged package commons-lang3-3.9-3i4dzc8--ReozoWGYaBnJqmpRPKOM61_ngi-pE3DwjA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-lang3-3.9-3i4dzc8--ReozoWGYaBnJqmpRPKOM61_ngi-pE3DwjA.jar' is inaccessible.
Jan 07, 2022 3:43:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:43:26.008Z: Staged package commons-logging-1.2-2t3qHqC-D1aXirMAa4rJKDSv7vvZt-TmMW_KV98PpjY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-logging-1.2-2t3qHqC-D1aXirMAa4rJKDSv7vvZt-TmMW_KV98PpjY.jar' is inaccessible.
Jan 07, 2022 3:43:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:43:26.077Z: Staged package commons-math3-3.6.1-HlbXsFjSi2Wr0la4RY44hbZ0wdWI-kPNfRy7nH7yswg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-math3-3.6.1-HlbXsFjSi2Wr0la4RY44hbZ0wdWI-kPNfRy7nH7yswg.jar' is inaccessible.
Jan 07, 2022 3:43:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:43:27.241Z: Staged package hamcrest-2.1-upOy46ViMiukMvChtTrdzFXLGIJTMZoCDtd_gk5pIFA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/hamcrest-2.1-upOy46ViMiukMvChtTrdzFXLGIJTMZoCDtd_gk5pIFA.jar' is inaccessible.
Jan 07, 2022 3:43:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:43:27.286Z: Staged package hamcrest-core-2.1-4JEJ5UoonYhQa5v-yYfd0Zn0IXyUZBMmaDUbmk8Avuk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/hamcrest-core-2.1-4JEJ5UoonYhQa5v-yYfd0Zn0IXyUZBMmaDUbmk8Avuk.jar' is inaccessible.
Jan 07, 2022 3:43:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:43:27.393Z: Staged package j2objc-annotations-1.3-Ia8wySJnvWEiwOC00gzMtmQaN-r5VsZUDsRx1YTmSns.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/j2objc-annotations-1.3-Ia8wySJnvWEiwOC00gzMtmQaN-r5VsZUDsRx1YTmSns.jar' is inaccessible.
Jan 07, 2022 3:43:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:43:27.462Z: Staged package jackson-core-asl-1.9.13-RAqctcqVshX5U9OiCmsaENofCbUpqd3qX4pJBd2rT1o.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-core-asl-1.9.13-RAqctcqVshX5U9OiCmsaENofCbUpqd3qX4pJBd2rT1o.jar' is inaccessible.
Jan 07, 2022 3:43:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:43:27.540Z: Staged package jackson-mapper-asl-1.9.13-dOegenby7breKTEqWi68z6AZEovAIezjhW12GX6b4MI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-mapper-asl-1.9.13-dOegenby7breKTEqWi68z6AZEovAIezjhW12GX6b4MI.jar' is inaccessible.
Jan 07, 2022 3:43:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:43:27.597Z: Staged package javax.annotation-api-1.3.2-4EulGVvNVV3JVlD3zGFNFR5LzVLSmhC4qiGX86uJq5s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/javax.annotation-api-1.3.2-4EulGVvNVV3JVlD3zGFNFR5LzVLSmhC4qiGX86uJq5s.jar' is inaccessible.
Jan 07, 2022 3:43:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:43:27.655Z: Staged package json-20200518-D_MObagvFa-vDna9bplqerVWaFEWaoP5c8ZKmg-UbS0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/json-20200518-D_MObagvFa-vDna9bplqerVWaFEWaoP5c8ZKmg-UbS0.jar' is inaccessible.
Jan 07, 2022 3:43:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:43:27.713Z: Staged package jsr305-3.0.2-dmrSoHg_JoeWLIrXTO7MOKKLn3Ki0IXuQ4t4E-ko0Mc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jsr305-3.0.2-dmrSoHg_JoeWLIrXTO7MOKKLn3Ki0IXuQ4t4E-ko0Mc.jar' is inaccessible.
Jan 07, 2022 3:43:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-07T15:43:28.675Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 07, 2022 3:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-07T15:46:29.751Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 07, 2022 3:49:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:49:25.297Z: Staged package annotations-4.1.1.4-unNOHoTAnWFa9qCdMwNLTwRC-Hct7BIO-zdthqVlrhU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/annotations-4.1.1.4-unNOHoTAnWFa9qCdMwNLTwRC-Hct7BIO-zdthqVlrhU.jar' is inaccessible.
Jan 07, 2022 3:49:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:49:25.461Z: Staged package avro-1.8.2-91SggwzmelqfpnpU7BXRA-8V4chQ17Jvr3tkfu3cgtM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/avro-1.8.2-91SggwzmelqfpnpU7BXRA-8V4chQ17Jvr3tkfu3cgtM.jar' is inaccessible.
Jan 07, 2022 3:49:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:49:26.100Z: Staged package beam-vendor-bytebuddy-1_11_0-0.1-SneHjzQgCpxxLkvyN8dugqO_sTzwXJ2995kqFu4l2i4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-bytebuddy-1_11_0-0.1-SneHjzQgCpxxLkvyN8dugqO_sTzwXJ2995kqFu4l2i4.jar' is inaccessible.
Jan 07, 2022 3:49:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:49:26.201Z: Staged package beam-vendor-guava-26_0-jre-0.1-8quizfXTrmcvJyOXi3zWDFtP5EofgqjksY5sIbsBqws.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-guava-26_0-jre-0.1-8quizfXTrmcvJyOXi3zWDFtP5EofgqjksY5sIbsBqws.jar' is inaccessible.
Jan 07, 2022 3:49:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:49:26.277Z: Staged package checker-compat-qual-2.5.5-EdE0skXpysxHRRTS1mtbhhj4A5oUZc3FW7wLNOAAi3o.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/checker-compat-qual-2.5.5-EdE0skXpysxHRRTS1mtbhhj4A5oUZc3FW7wLNOAAi3o.jar' is inaccessible.
Jan 07, 2022 3:49:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:49:26.541Z: Staged package commons-lang3-3.9-3i4dzc8--ReozoWGYaBnJqmpRPKOM61_ngi-pE3DwjA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-lang3-3.9-3i4dzc8--ReozoWGYaBnJqmpRPKOM61_ngi-pE3DwjA.jar' is inaccessible.
Jan 07, 2022 3:49:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:49:26.593Z: Staged package commons-logging-1.2-2t3qHqC-D1aXirMAa4rJKDSv7vvZt-TmMW_KV98PpjY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-logging-1.2-2t3qHqC-D1aXirMAa4rJKDSv7vvZt-TmMW_KV98PpjY.jar' is inaccessible.
Jan 07, 2022 3:49:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:49:26.638Z: Staged package commons-math3-3.6.1-HlbXsFjSi2Wr0la4RY44hbZ0wdWI-kPNfRy7nH7yswg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-math3-3.6.1-HlbXsFjSi2Wr0la4RY44hbZ0wdWI-kPNfRy7nH7yswg.jar' is inaccessible.
Jan 07, 2022 3:49:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:49:28.080Z: Staged package hamcrest-2.1-upOy46ViMiukMvChtTrdzFXLGIJTMZoCDtd_gk5pIFA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/hamcrest-2.1-upOy46ViMiukMvChtTrdzFXLGIJTMZoCDtd_gk5pIFA.jar' is inaccessible.
Jan 07, 2022 3:49:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:49:28.117Z: Staged package hamcrest-core-2.1-4JEJ5UoonYhQa5v-yYfd0Zn0IXyUZBMmaDUbmk8Avuk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/hamcrest-core-2.1-4JEJ5UoonYhQa5v-yYfd0Zn0IXyUZBMmaDUbmk8Avuk.jar' is inaccessible.
Jan 07, 2022 3:49:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:49:28.226Z: Staged package j2objc-annotations-1.3-Ia8wySJnvWEiwOC00gzMtmQaN-r5VsZUDsRx1YTmSns.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/j2objc-annotations-1.3-Ia8wySJnvWEiwOC00gzMtmQaN-r5VsZUDsRx1YTmSns.jar' is inaccessible.
Jan 07, 2022 3:49:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:49:28.316Z: Staged package jackson-core-asl-1.9.13-RAqctcqVshX5U9OiCmsaENofCbUpqd3qX4pJBd2rT1o.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-core-asl-1.9.13-RAqctcqVshX5U9OiCmsaENofCbUpqd3qX4pJBd2rT1o.jar' is inaccessible.
Jan 07, 2022 3:49:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:49:28.421Z: Staged package jackson-mapper-asl-1.9.13-dOegenby7breKTEqWi68z6AZEovAIezjhW12GX6b4MI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-mapper-asl-1.9.13-dOegenby7breKTEqWi68z6AZEovAIezjhW12GX6b4MI.jar' is inaccessible.
Jan 07, 2022 3:49:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:49:28.498Z: Staged package javax.annotation-api-1.3.2-4EulGVvNVV3JVlD3zGFNFR5LzVLSmhC4qiGX86uJq5s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/javax.annotation-api-1.3.2-4EulGVvNVV3JVlD3zGFNFR5LzVLSmhC4qiGX86uJq5s.jar' is inaccessible.
Jan 07, 2022 3:49:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:49:28.577Z: Staged package json-20200518-D_MObagvFa-vDna9bplqerVWaFEWaoP5c8ZKmg-UbS0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/json-20200518-D_MObagvFa-vDna9bplqerVWaFEWaoP5c8ZKmg-UbS0.jar' is inaccessible.
Jan 07, 2022 3:49:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:49:28.623Z: Staged package jsr305-3.0.2-dmrSoHg_JoeWLIrXTO7MOKKLn3Ki0IXuQ4t4E-ko0Mc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jsr305-3.0.2-dmrSoHg_JoeWLIrXTO7MOKKLn3Ki0IXuQ4t4E-ko0Mc.jar' is inaccessible.
Jan 07, 2022 3:49:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-07T15:49:29.579Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 07, 2022 3:52:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-07T15:52:28.668Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 07, 2022 3:55:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:55:25.178Z: Staged package annotations-4.1.1.4-unNOHoTAnWFa9qCdMwNLTwRC-Hct7BIO-zdthqVlrhU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/annotations-4.1.1.4-unNOHoTAnWFa9qCdMwNLTwRC-Hct7BIO-zdthqVlrhU.jar' is inaccessible.
Jan 07, 2022 3:55:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:55:25.290Z: Staged package avro-1.8.2-91SggwzmelqfpnpU7BXRA-8V4chQ17Jvr3tkfu3cgtM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/avro-1.8.2-91SggwzmelqfpnpU7BXRA-8V4chQ17Jvr3tkfu3cgtM.jar' is inaccessible.
Jan 07, 2022 3:55:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:55:25.852Z: Staged package beam-vendor-bytebuddy-1_11_0-0.1-SneHjzQgCpxxLkvyN8dugqO_sTzwXJ2995kqFu4l2i4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-bytebuddy-1_11_0-0.1-SneHjzQgCpxxLkvyN8dugqO_sTzwXJ2995kqFu4l2i4.jar' is inaccessible.
Jan 07, 2022 3:55:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:55:25.942Z: Staged package beam-vendor-guava-26_0-jre-0.1-8quizfXTrmcvJyOXi3zWDFtP5EofgqjksY5sIbsBqws.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-guava-26_0-jre-0.1-8quizfXTrmcvJyOXi3zWDFtP5EofgqjksY5sIbsBqws.jar' is inaccessible.
Jan 07, 2022 3:55:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:55:26.003Z: Staged package checker-compat-qual-2.5.5-EdE0skXpysxHRRTS1mtbhhj4A5oUZc3FW7wLNOAAi3o.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/checker-compat-qual-2.5.5-EdE0skXpysxHRRTS1mtbhhj4A5oUZc3FW7wLNOAAi3o.jar' is inaccessible.
Jan 07, 2022 3:55:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:55:26.171Z: Staged package commons-lang3-3.9-3i4dzc8--ReozoWGYaBnJqmpRPKOM61_ngi-pE3DwjA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-lang3-3.9-3i4dzc8--ReozoWGYaBnJqmpRPKOM61_ngi-pE3DwjA.jar' is inaccessible.
Jan 07, 2022 3:55:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:55:26.220Z: Staged package commons-logging-1.2-2t3qHqC-D1aXirMAa4rJKDSv7vvZt-TmMW_KV98PpjY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-logging-1.2-2t3qHqC-D1aXirMAa4rJKDSv7vvZt-TmMW_KV98PpjY.jar' is inaccessible.
Jan 07, 2022 3:55:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:55:26.260Z: Staged package commons-math3-3.6.1-HlbXsFjSi2Wr0la4RY44hbZ0wdWI-kPNfRy7nH7yswg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-math3-3.6.1-HlbXsFjSi2Wr0la4RY44hbZ0wdWI-kPNfRy7nH7yswg.jar' is inaccessible.
Jan 07, 2022 3:55:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:55:27.424Z: Staged package hamcrest-2.1-upOy46ViMiukMvChtTrdzFXLGIJTMZoCDtd_gk5pIFA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/hamcrest-2.1-upOy46ViMiukMvChtTrdzFXLGIJTMZoCDtd_gk5pIFA.jar' is inaccessible.
Jan 07, 2022 3:55:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:55:27.463Z: Staged package hamcrest-core-2.1-4JEJ5UoonYhQa5v-yYfd0Zn0IXyUZBMmaDUbmk8Avuk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/hamcrest-core-2.1-4JEJ5UoonYhQa5v-yYfd0Zn0IXyUZBMmaDUbmk8Avuk.jar' is inaccessible.
Jan 07, 2022 3:55:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:55:27.540Z: Staged package j2objc-annotations-1.3-Ia8wySJnvWEiwOC00gzMtmQaN-r5VsZUDsRx1YTmSns.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/j2objc-annotations-1.3-Ia8wySJnvWEiwOC00gzMtmQaN-r5VsZUDsRx1YTmSns.jar' is inaccessible.
Jan 07, 2022 3:55:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:55:27.624Z: Staged package jackson-core-asl-1.9.13-RAqctcqVshX5U9OiCmsaENofCbUpqd3qX4pJBd2rT1o.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-core-asl-1.9.13-RAqctcqVshX5U9OiCmsaENofCbUpqd3qX4pJBd2rT1o.jar' is inaccessible.
Jan 07, 2022 3:55:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:55:27.750Z: Staged package jackson-mapper-asl-1.9.13-dOegenby7breKTEqWi68z6AZEovAIezjhW12GX6b4MI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-mapper-asl-1.9.13-dOegenby7breKTEqWi68z6AZEovAIezjhW12GX6b4MI.jar' is inaccessible.
Jan 07, 2022 3:55:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:55:27.822Z: Staged package javax.annotation-api-1.3.2-4EulGVvNVV3JVlD3zGFNFR5LzVLSmhC4qiGX86uJq5s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/javax.annotation-api-1.3.2-4EulGVvNVV3JVlD3zGFNFR5LzVLSmhC4qiGX86uJq5s.jar' is inaccessible.
Jan 07, 2022 3:55:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:55:27.891Z: Staged package json-20200518-D_MObagvFa-vDna9bplqerVWaFEWaoP5c8ZKmg-UbS0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/json-20200518-D_MObagvFa-vDna9bplqerVWaFEWaoP5c8ZKmg-UbS0.jar' is inaccessible.
Jan 07, 2022 3:55:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-07T15:55:27.930Z: Staged package jsr305-3.0.2-dmrSoHg_JoeWLIrXTO7MOKKLn3Ki0IXuQ4t4E-ko0Mc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jsr305-3.0.2-dmrSoHg_JoeWLIrXTO7MOKKLn3Ki0IXuQ4t4E-ko0Mc.jar' is inaccessible.
Jan 07, 2022 3:55:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-07T15:55:28.825Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 07, 2022 3:58:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-07T15:58:28.051Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 07, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-07T16:00:38.587Z: Cancel request is committed for workflow job: 2022-01-07_04_46_10-1307797473656761051.
Jan 07, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-07T16:00:38.616Z: Cleaning up.
Jan 07, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-07T16:00:38.729Z: Stopping worker pool...
Jan 07, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-07T16:00:38.774Z: Stopping worker pool...
Jan 07, 2022 4:03:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-07T16:03:08.023Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 07, 2022 4:03:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-07T16:03:08.059Z: Worker pool stopped.
Jan 07, 2022 4:03:16 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-07_04_46_10-1307797473656761051 finished with status CANCELLED.
Load test results for test (ID): bce6d2ba-68fd-497d-9bce-739be181233d and timestamp: 2022-01-07T12:46:04.841000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11518.205
dataflow_v2_java11_total_bytes_count             1.94104724E10
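The two metrics above imply an average sustained throughput; a quick back-of-envelope check (hypothetical helper, not part of the Beam load-test harness), using the reported runtime and byte count:

```java
// Back-of-envelope throughput from the load-test metrics reported above.
// Hypothetical helper class; not part of the Beam load-test harness.
public class ThroughputCheck {
    static double bytesPerSecond(double totalBytes, double runtimeSec) {
        return totalBytes / runtimeSec;
    }

    public static void main(String[] args) {
        // Values copied from the metrics table for this run.
        double bps = bytesPerSecond(1.94104724E10, 11518.205);
        // Roughly 1.69e6 bytes/s sustained over the ~3.2 h streaming run.
        System.out.printf("%.3e bytes/s%n", bps);
    }
}
```
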
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
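The RuntimeException above is raised because the load test's terminal-state check treats anything other than a successful completion as a failure. A minimal sketch of that pattern follows; the class, enum, and method names here are simplified stand-ins, not the actual Beam JobFailure source:

```java
// Hypothetical, simplified version of the terminal-state check that produces
// "Invalid job state: CANCELLED." -- not the actual Beam JobFailure code.
public class JobStateCheck {
    enum State { DONE, CANCELLED, FAILED }

    static void handleFailure(State state) {
        // Any non-DONE terminal state (CANCELLED here, since the Jenkins job
        // cancelled the pipeline) fails the load test with a nonzero exit.
        if (state != State.DONE) {
            throw new RuntimeException("Invalid job state: " + state + ".");
        }
    }

    public static void main(String[] args) {
        handleFailure(State.DONE); // succeeds silently
    }
}
```

This is why a deliberately cancelled pipeline still propagates up as a Gradle task failure: the JVM exits nonzero, and Gradle reports "finished with non-zero exit value 1".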

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220107124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a821f866feb3e6e5eba153a7bc47c3f79432b832f6e971a4b93562ad0000e1ef
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220107124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a821f866feb3e6e5eba153a7bc47c3f79432b832f6e971a4b93562ad0000e1ef]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220107124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a821f866feb3e6e5eba153a7bc47c3f79432b832f6e971a4b93562ad0000e1ef])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a821f866feb3e6e5eba153a7bc47c3f79432b832f6e971a4b93562ad0000e1ef
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a821f866feb3e6e5eba153a7bc47c3f79432b832f6e971a4b93562ad0000e1ef
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a821f866feb3e6e5eba153a7bc47c3f79432b832f6e971a4b93562ad0000e1ef].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:37b32517fc226bf927301945f30602de44aa7b6ba8fef662169084699aaa409d
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:37b32517fc226bf927301945f30602de44aa7b6ba8fef662169084699aaa409d
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:37b32517fc226bf927301945f30602de44aa7b6ba8fef662169084699aaa409d].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m
109 actionable tasks: 73 executed, 32 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/h57a7gdzssfa4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #203

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/203/display/redirect?page=changes>

Changes:

[aydar.zaynutdinov] [BEAM-13565][Playground]

[relax] update bom

[zyichi] Upgrade python library versions in base_image_requirements.txt

[noreply] [BEAM-13567] Consolidate runner flag definition. (#16426)

[noreply] [BEAM-13601] Don't cache Row types for a schema. (#16427)

[noreply] [BEAM-13430] Re-enable checkerframework (#16429)

[noreply] [BEAM-13430] Ensure that testRuntimeMigration depends on "default"

[noreply] Merge pull request #16277 from [BEAM-13124][Playground] Create readiness

[noreply] Merge pull request #16314 from [BEAM-13260][Playground]Implement setup

[noreply] Merge pull request #16383 from [BEAM-13566][Playground] Add logic of

[noreply] Merge pull request #16365 from [BEAM-13559][Playground] Remove tag in

[noreply] Merge pull request #16360 from [BEAM-13546][Playground] Update nginx

[noreply] Merge pull request #16192 from [BEAM-13395] [Playground] Tag katas

[noreply] Merge pull request #16254 from [BEAM-13249][Playground] Security – Mock

[ningkang0957] [BEAM-12879] Prevented missing permission from failing GCS I/O

[noreply] Merge pull request #16347: fix: move connector to use v1 BigQuery

[noreply] [BEAM-13603] Fix bug in apache_beam.utils.Shared (#16437)

[noreply] [BEAM-10345] Add an import guard to support recent google-cloud-spanner


------------------------------------------
[...truncated 51.85 KB...]
4c5e589408a3: Pushed
624025d2ed82: Pushed
4afc3431b8f8: Pushed
43cd40da8f13: Pushed
7d7a8ba090e9: Pushed
b566a0eabf87: Pushed
3bb5258f46d2: Layer already exists
832e177bb500: Layer already exists
26a504e63be4: Layer already exists
f9e18e59a565: Layer already exists
8bf42db0de72: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
79ddb0a130ec: Pushed
1924bfd0f3af: Pushed
20220106124353: digest: sha256:7f52ed10080141871f8f1b9dadc6532c506db584bffa32184de9a3797118e358 size: 4520

> Task :sdks:java:testing:load-tests:run
Jan 06, 2022 12:46:22 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 06, 2022 12:46:22 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Jan 06, 2022 12:46:23 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 06, 2022 12:46:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 06, 2022 12:46:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 06, 2022 12:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Jan 06, 2022 12:46:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 06, 2022 12:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 6301f6c8c50de146b3a0c72d195b67f15dfa9917b926c0f5639a261f70315053> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-YwH2yMUN4UazoMctGVtn8V36mRe5JsD1Y5omH3AxUFM.pb
Jan 06, 2022 12:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 06, 2022 12:46:29 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f5b9db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@507d64aa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37045b48, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60b34931, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4aa21f9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71c17a57, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@640ab13c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e0a864d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@440e3ce6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e67f5f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fd53053, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4527f70a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@707b1a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7132a9dc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57435801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da66a44, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@527fc8e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@61bfc9bf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c7106d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@329bad59]
Jan 06, 2022 12:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 06, 2022 12:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 06, 2022 12:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 06, 2022 12:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 06, 2022 12:46:29 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5854a18, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d5556bf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@791c12e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b112b13, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24eb65e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ac3f6f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1abebef3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@18f55704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67cefd84, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5fbe155, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6add8e3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58a2b917, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48904d5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12bbfc54, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1491344a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59b65dce, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1386313f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e922647, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@433c6abb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@288f173f]
Jan 06, 2022 12:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 06, 2022 12:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 06, 2022 12:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 06, 2022 12:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 06, 2022 12:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 06, 2022 12:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 06, 2022 12:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 06, 2022 12:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 06, 2022 12:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 06, 2022 12:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 06, 2022 12:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 06, 2022 12:46:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 06, 2022 12:46:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-06_04_46_29-15093586055417001318?project=apache-beam-testing
Jan 06, 2022 12:46:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-06_04_46_29-15093586055417001318
Jan 06, 2022 12:46:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-06_04_46_29-15093586055417001318
Jan 06, 2022 12:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-06T12:46:35.741Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-gj7z. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:40.836Z: Worker configuration: e2-standard-2 in us-central1-b.
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:41.587Z: Expanding SplittableParDo operations into optimizable parts.
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:41.623Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:41.691Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:41.759Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:41.790Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:41.848Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:41.943Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:41.963Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:41.991Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.026Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.055Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.092Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.125Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.160Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.189Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.228Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.254Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.293Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.321Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.357Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.391Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.428Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.457Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.489Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.517Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.540Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.576Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.600Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.623Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 06, 2022 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:46:42.978Z: Starting 5 workers in us-central1-b...
Jan 06, 2022 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:47:14.485Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 06, 2022 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:47:27.332Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 06, 2022 12:48:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:48:25.062Z: Workers have started successfully.
Jan 06, 2022 12:48:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T12:48:25.084Z: Workers have started successfully.
Jan 06, 2022 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T16:00:37.181Z: Cancel request is committed for workflow job: 2022-01-06_04_46_29-15093586055417001318.
Jan 06, 2022 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T16:00:37.255Z: Cleaning up.
Jan 06, 2022 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T16:00:37.346Z: Stopping worker pool...
Jan 06, 2022 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T16:00:37.388Z: Stopping worker pool...
Jan 06, 2022 4:03:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T16:03:11.288Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 06, 2022 4:03:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-06T16:03:11.326Z: Worker pool stopped.
Jan 06, 2022 4:03:16 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-06_04_46_29-15093586055417001318 finished with status CANCELLED.
Load test results for test (ID): 467bee2d-801a-4407-af20-dc76119bc6fc and timestamp: 2022-01-06T12:46:23.194000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11480.679
dataflow_v2_java11_total_bytes_count             2.30969047E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220106124353
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7f52ed10080141871f8f1b9dadc6532c506db584bffa32184de9a3797118e358
Deleted: sha256:4f0c8dd62010c7b740a495c7a45f82b0ef2322fd895d07567835fce856aa6183
Deleted: sha256:ac027c8ca90cb6281d43f5ed50bf44ef763f1782c8fdc8fcdb558b01191695a7
Deleted: sha256:0e5d4f704fa527c144ea31b7b44b97c9ad8184c999e3ee044866c3536d5f0127
Deleted: sha256:1fa1782c4f13b990085c19d17183b4f621db092b29bc60a7fcd7c25abb7794fa
Deleted: sha256:530a6f1aceeab2304bd089494782de85462cd25e8d602fb719354106c80c103d
Deleted: sha256:cddbd957eff53019d5c937092dfbec47b8a7be7428c24e0ca8305f99ee5e8f86
Deleted: sha256:fd4030329332b6e389dce3d8c019f63cef53b5bd2b1b104849eb0a1167e35544
Deleted: sha256:5ee896ac8a97283acd5634f6167da6e021edbaa12e1581038fb3f8642e25a854
Deleted: sha256:511a69fed95c551d40b04579be5f7325ecea875774bc596885dc771aedcdbff2
Deleted: sha256:bb71266021c9c917764351ba822df3000eedc1225c0505e9f3a6fc28d2941463
Deleted: sha256:b5c3a61b873a77324676327948339ef48a810bae8f77ece828c6d51de5514ca1
Deleted: sha256:ff910c9fb6c8ce2f5ede334b4b5fd78dfb6772c85dcb52bbb90dbd7a1fdd2ca1
Deleted: sha256:ddecaf6aa8de6c035efeaf30adffdc3950f1be71579b205914695531f6f2894e
Deleted: sha256:292f8c4dc0defbc6b0bac2d2db94463e656043c753401fc051ea5a88ebea8b93
Deleted: sha256:51a20a8377bdaf0cb7b3893e079d88b024f495dd2516cf1811677c757637637a
Deleted: sha256:77376927f2061a6532a530950d08c52798abe1d41cefda0adf51bce95e5731e6
Deleted: sha256:524d0d40f838f2bd23bbe0db116df95f8674769ba895c480e4aa6bde63c8ea52
Deleted: sha256:57640528df0d00e646a0f691e310f897c95901080ef19506c6dd54b905e26c4f
Deleted: sha256:9144ac81ac9609827cd864d16535a97de0e57ccce7ab9c63650865f56f18b77d
Deleted: sha256:0e08e3df55736c3f54c05f4496e325c9c5304e92b8ce36993c54c958e250b63d
Deleted: sha256:61c7cf51ce09490d95f2a89b85e8dff99ebe80f2218c88f664878b4e85c3becb
Deleted: sha256:30f67045b612b9d892e1f0a3658160535fdda8afe338d7d9a86e1f991efd8567
Deleted: sha256:ac7a5c607a07ae3785929b60cf2576495fdbe93e71c46ab2b09befb51087239c
Deleted: sha256:ef049b5a0c9532dd72f54d95d17b111df6ed2eb2c06c8d7e5a54e1ca70a85df4
Deleted: sha256:f1ccd0a8861b3e7d39a944b894045dd6d03121329f339512b2caf0f8d5f4f057
Deleted: sha256:668ac4accb8e4840fcefc537ad2658324a0bd8f58a99d5b7595a2093b77f038f
Deleted: sha256:d7b4193782ff7eccf75676e67ddafa094a1f7cb86b3eab5257c4be45ce772784
Deleted: sha256:23b0a8d2a04f9a6cf29004c79b21aa16cafcf5dfb0696a4fc00bec6a1039582d
Deleted: sha256:11a66b97b7b8e6c3fee30d3b5db72136ce85a4192ef9946790882b079e7bfd73
Deleted: sha256:b9d3614407d0ef9a58f1c8fe787cbb0a04bc4eb97789c128d99d48a627eeb024
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220106124353]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7f52ed10080141871f8f1b9dadc6532c506db584bffa32184de9a3797118e358]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220106124353] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7f52ed10080141871f8f1b9dadc6532c506db584bffa32184de9a3797118e358])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7f52ed10080141871f8f1b9dadc6532c506db584bffa32184de9a3797118e358
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7f52ed10080141871f8f1b9dadc6532c506db584bffa32184de9a3797118e358
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7f52ed10080141871f8f1b9dadc6532c506db584bffa32184de9a3797118e358].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 51s
109 actionable tasks: 75 executed, 30 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/74xt2rkwmsfeu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #202

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/202/display/redirect?page=changes>

Changes:

[johnjcasey] [BEAM-12391] update avro sink to close the opened file handle, instead

[Robert Bradshaw] [BEAM-13482] Python fully qualified name external transforms.

[Robert Bradshaw] Add Python expansion service entry point.

[Kyle Weaver] [BEAM-13569] Change Spark dependencies to implementation.

[Kyle Weaver] remove redundant dependency

[wuren] [BEAM-13591] Bump log4j2 version to 2.17.1

[Steve Niemitz] [BEAM-13459] Update CHANGES.md, add note about artifact caching for

[relax] Add Flink runner support for OrderedListState. This version reads the

[noreply] Merge pull request #16404: [BEAM-13586] Fix NPE in

[zyichi] Fix sdk_container_builder too many values to unpack error

[noreply] [BEAM-13480] Sickbay PubSubIntegrationTest.test_streaming_data_only on

[Kyle Weaver] remove redundant testImplementation dependencies

[noreply] [BEAM-13430] Swap to use "mainClass" instead of "main" since it was

[zyichi] Fix remaining failing perf IT tests.

[noreply] [BEAM-13430] Replace deprecated "appendix" with "archiveAppendix"

[noreply] [BEAM-13015] Add jamm as a java agent to the Java SDK harness container

[noreply] [BEAM-13430] Partially revert

[noreply] Merge pull request #15863 from [BEAM-13184] Autosharding for

[noreply] [BEAM-11936] Enable FloatingPointAssertionWithinEpsilon errorprone check

[noreply] [BEAM-11936] Enable LockNotBeforeTry errorprone check (#16259)

[noreply] [BEAM-11936] Enable errorprone unused checks (#16262)

[noreply] Add Nexmark Query 14 (#16337)

[noreply] [BEAM-13015] Migrate all user state and side implementations to support

[noreply] [BEAM-13015] Use 20% of memory when the maximum has been configured.


------------------------------------------
[...truncated 44.98 KB...]
a38a8df97f6a: Pushed
c440b437d45a: Pushed
57d5327a42da: Pushed
bc62d2a7ce2e: Pushed
7cadca06c395: Pushed
3bb5258f46d2: Layer already exists
832e177bb500: Layer already exists
f9e18e59a565: Layer already exists
c0662f28aced: Pushed
8bf42db0de72: Layer already exists
26a504e63be4: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
b2578895a203: Pushed
2814bc0217ea: Pushed
20220105124332: digest: sha256:7137665db55d01d50172fb0d4768f2e6cecb79cbdb098517f0d09d97ed136594 size: 4520

> Task :sdks:java:testing:load-tests:run
Jan 05, 2022 12:45:15 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 05, 2022 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Jan 05, 2022 12:45:17 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 05, 2022 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 05, 2022 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 05, 2022 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Jan 05, 2022 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 05, 2022 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash a934baf319f10e53779cc9bfdcdccc462b838ae8f63b3e6abe0b516f090b9865> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-qTS68xnxDlN3nMm_3NzMRiuDiuj2Oz5qvgtRbwkLmGU.pb
Jan 05, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 05, 2022 12:45:21 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48a0c8aa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6192a5d5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3722c145, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cbc2e3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2975a9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@765ffb14, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57562473, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a360554, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@424de326, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4bc33720, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2dd0f797, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67064bdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a7fd0c9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@18578491, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3291b443, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@671c4166, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53d13cd4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77865933, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@480ad82c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d18b73a]
Jan 05, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 05, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 05, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 05, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 05, 2022 12:45:21 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@553bc36c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@380e1909, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d5ef498, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@95eb320, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f521c4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4afbb6c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@10db6131, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c6017b9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4730e0f0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@506a1372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7332a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77c233af, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b56ac7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37c41ec0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35a0e495, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5399f6c5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fe64d23, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58437801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6af5bbd0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76464795]
Jan 05, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 05, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 05, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 05, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 05, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 05, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 05, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 05, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 05, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 05, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 05, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 05, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Jan 05, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-05_04_45_21-6676539346797392784?project=apache-beam-testing
Jan 05, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-05_04_45_21-6676539346797392784
Jan 05, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-05_04_45_21-6676539346797392784
Jan 05, 2022 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-05T12:45:28.967Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-z6cy. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:32.578Z: Worker configuration: e2-standard-2 in us-central1-f.
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.056Z: Expanding SplittableParDo operations into optimizable parts.
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.093Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.162Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.222Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.247Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.293Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.390Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.422Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.450Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.487Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.513Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.538Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.565Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.598Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.646Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.674Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.699Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.730Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.764Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.791Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.816Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.844Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.889Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.916Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.942Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.966Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:33.999Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:34.028Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:34.052Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 05, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:45:34.418Z: Starting 5 workers in us-central1-f...
Jan 05, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:46:05.938Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 05, 2022 12:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:46:19.489Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 05, 2022 12:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:47:17.842Z: Workers have started successfully.
Jan 05, 2022 12:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T12:47:17.866Z: Workers have started successfully.
Jan 05, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T16:00:39.869Z: Cancel request is committed for workflow job: 2022-01-05_04_45_21-6676539346797392784.
Jan 05, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T16:00:39.929Z: Cleaning up.
Jan 05, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T16:00:39.992Z: Stopping worker pool...
Jan 05, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T16:00:40.039Z: Stopping worker pool...
Jan 05, 2022 4:03:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T16:03:10.078Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 05, 2022 4:03:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-05T16:03:10.118Z: Worker pool stopped.
Jan 05, 2022 4:03:16 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-05_04_45_21-6676539346797392784 finished with status CANCELLED.
Load test results for test (ID): 7d9982c9-e062-4434-8d46-088216e681a0 and timestamp: 2022-01-05T12:45:16.697000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11560.481
dataflow_v2_java11_total_bytes_count             2.39643879E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220105124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7137665db55d01d50172fb0d4768f2e6cecb79cbdb098517f0d09d97ed136594
Deleted: sha256:0e827a4d980825e26c5c2d56e97857d739ed2083b8b0bae68a00ab6c2c3ee4e5
Deleted: sha256:a2bc780c0d88c20578f8478923934034ee2b8db57268b617dbcff046f35390c3
Deleted: sha256:ceedaf99168457ed04f9ca6f4368bb6a85f6d29b5bed7f4fbe922d1b90d6a93b
Deleted: sha256:2eb017db841e379f44308ebeace411f3f2146ad4f4e979cf6bef3e8b06c8639f
Deleted: sha256:0de3629b3f4ba3e6fb1f8b546717c6a78d25234647807d6b82ca5a6a2f3465bd
Deleted: sha256:de8eac2c5c4221c075ede7b88932157bcf403249f0fc3ea6bce7ea530daa921c
Deleted: sha256:7bec4ca7b92d6695cf4bb4919cafb33e9d7255424178b6a18f1d9547df02f778
Deleted: sha256:b8f000818eda283484bf53065768916386eeb0a1cfe405af7be7d812647d526d
Deleted: sha256:7aca4e5bee946b1e3ca042223e082d039fc71de7f45a7497d38873c05e46ef1c
Deleted: sha256:aee8816cfe97a735614a447381e4f04412baf27339e3b89c77fca7f258cceacc
Deleted: sha256:f986bd77d4926c81c0d51e2a54b8732f9dd911dd052aa497812afade5a445325
Deleted: sha256:eac72d85b2309e63174950be2fc74df8e708c141bc3a388c06aefa4799682af5
Deleted: sha256:f288495dac0edc1c231f30a6159b7ffa5d6699b25ac500ac2467d3676b3ddd59
Deleted: sha256:79355896f4cd65c1cdddb9e18c30ea145f08d9e89d56c7b6d73166c5edb7f542
Deleted: sha256:93c438fb125a42390afaeaf99ba53574c670bcc2b1bc6b5dac037f39705a5d30
Deleted: sha256:0ae842051a97172e946be6f46cbab586397f6af0d3ad3230ed5cd75396ec4965
Deleted: sha256:c1a4d7ce85f46b3853b9c218b8eea9ff68f32152ca132758213cef52c8a83127
Deleted: sha256:e76d1dea2fb20793674124805ef68677075c732b6bf3275a11c385e973c17014
Deleted: sha256:69467249d4830dbd28046693c5ed269dcc7ca60410d413e6d14548902a3135f0
Deleted: sha256:7792e9df2c213d348e2542ed66c17cde09a08b1db4fda4088a087d8c17898227
Deleted: sha256:3eed3985600bb8b28d1895d7a94c5895739a6d0e7f34dc520fcb06030e80f0b1
Deleted: sha256:f0c858696712ad971724a21e5f6f47df7e0bd3b862f9923b0701b40ad9bceb12
Deleted: sha256:a7b22ea0277298f7d43635c744a9a85e0c09b624e390707259c0e565292d2c72
Deleted: sha256:8b862980342be1a308902c7a1885ceb58b40e36623935ced063bc37da6472b05
Deleted: sha256:895e62a38d150202e15385318db8db8f8e44afa4348d28c883b6cde0045d26ec
Deleted: sha256:a910b330e8b70842f24a7ce6373a4ca19733cfb78912decb0e3abd92bbf27889
Deleted: sha256:de3c5ae05298cbae335e35b5c92661b7727c76b500402d8745f4f2f8fedbe09f
Deleted: sha256:22de5b1eb69cf2f296e0b5282de0331f459cf2a1441cbfc1c8b9b5fafabc83f3
Deleted: sha256:7b6fd1572752f26347ba56276ed613b0bd46dc7e5f2fecab6a5116e553eb8eab
Deleted: sha256:e53624c39170eacc9d4d86446bf7d93350f65ea7b31295441710b0c483e7af3c
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220105124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7137665db55d01d50172fb0d4768f2e6cecb79cbdb098517f0d09d97ed136594]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220105124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7137665db55d01d50172fb0d4768f2e6cecb79cbdb098517f0d09d97ed136594])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7137665db55d01d50172fb0d4768f2e6cecb79cbdb098517f0d09d97ed136594
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7137665db55d01d50172fb0d4768f2e6cecb79cbdb098517f0d09d97ed136594
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7137665db55d01d50172fb0d4768f2e6cecb79cbdb098517f0d09d97ed136594].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 20m 2s
89 actionable tasks: 52 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/23ltmn2wictwa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #201

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/201/display/redirect?page=changes>

Changes:

[mmack] [BEAM-13509] Stop sharing SQS client between readers of same source.

[mmack] [BEAM-13587] Attempt to load AWS region from default provider chain in


------------------------------------------
[...truncated 44.17 KB...]
3d39fcb7121f: Preparing
cfd8f0905ca0: Preparing
ecb9d918359f: Preparing
9969548bfc57: Preparing
c3d139bb1290: Preparing
9ed35a672ef7: Preparing
3b6a23a082e5: Preparing
bed8de9ba397: Preparing
3bb5258f46d2: Preparing
832e177bb500: Preparing
3bb5258f46d2: Waiting
f9e18e59a565: Preparing
26a504e63be4: Preparing
832e177bb500: Waiting
8bf42db0de72: Preparing
31892cc314cb: Preparing
26a504e63be4: Waiting
8bf42db0de72: Waiting
11936051f93b: Preparing
11936051f93b: Waiting
31892cc314cb: Waiting
9969548bfc57: Waiting
3b6a23a082e5: Waiting
bed8de9ba397: Waiting
ecb9d918359f: Waiting
cfd8f0905ca0: Waiting
e1874121dd0a: Pushed
3d39fcb7121f: Pushed
4fc88b0bbcec: Pushed
7040f79cdc38: Pushed
cfd8f0905ca0: Pushed
6d04881c087e: Pushed
9969548bfc57: Pushed
c3d139bb1290: Pushed
3b6a23a082e5: Pushed
3bb5258f46d2: Layer already exists
bed8de9ba397: Pushed
832e177bb500: Layer already exists
f9e18e59a565: Layer already exists
8bf42db0de72: Layer already exists
ecb9d918359f: Pushed
26a504e63be4: Layer already exists
11936051f93b: Layer already exists
31892cc314cb: Layer already exists
9ed35a672ef7: Pushed
20220104124340: digest: sha256:11440593d05b58c555056a779165a75590dae6404ce10696b92ab5de6927ed10 size: 4311

> Task :sdks:java:testing:load-tests:run
Jan 04, 2022 12:45:53 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 04, 2022 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Jan 04, 2022 12:45:54 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 04, 2022 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 04, 2022 12:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 04, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Jan 04, 2022 12:45:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 04, 2022 12:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 76afdf79b87ee40efcfb8518451e84fd1d988cfc1ec422fa47936d06b0aa21d4> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-dq_febh-5A78-4UYRR6E_R2YjPwexCL6R5NtBrCqIdQ.pb
Jan 04, 2022 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 04, 2022 12:45:58 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33634f04, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4993febc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7601bc96, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48a0c8aa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6192a5d5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3722c145, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cbc2e3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2975a9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@765ffb14, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57562473, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a360554, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@424de326, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4bc33720, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2dd0f797, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67064bdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a7fd0c9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@18578491, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3291b443, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@671c4166, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53d13cd4]
Jan 04, 2022 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 04, 2022 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 04, 2022 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 04, 2022 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 04, 2022 12:45:58 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f1e58ca, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57f847af, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ed34ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@553bc36c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@380e1909, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d5ef498, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@95eb320, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f521c4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4afbb6c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@10db6131, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c6017b9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4730e0f0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@506a1372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7332a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77c233af, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b56ac7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37c41ec0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35a0e495, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5399f6c5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fe64d23]
Jan 04, 2022 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 04, 2022 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 04, 2022 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 04, 2022 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 04, 2022 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 04, 2022 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 04, 2022 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 04, 2022 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 04, 2022 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 04, 2022 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 04, 2022 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 04, 2022 12:45:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Jan 04, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-04_04_45_59-6945613796395576760?project=apache-beam-testing
Jan 04, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-04_04_45_59-6945613796395576760
Jan 04, 2022 12:46:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-04_04_45_59-6945613796395576760
Jan 04, 2022 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-04T12:46:07.084Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-n2tk. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 04, 2022 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:12.572Z: Worker configuration: e2-standard-2 in us-central1-f.
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:13.204Z: Expanding SplittableParDo operations into optimizable parts.
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:13.237Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:13.298Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:13.388Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:13.406Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:13.470Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:13.604Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:13.648Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:13.695Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:13.736Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:13.774Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:13.797Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:13.836Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:13.902Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:13.937Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:13.972Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:14.006Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:14.051Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:14.078Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:14.108Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:14.134Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:14.167Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:14.212Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:14.246Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:14.270Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:14.294Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:14.325Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:14.360Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:14.389Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 04, 2022 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:15.083Z: Starting 5 workers in us-central1-f...
Jan 04, 2022 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:46:37.350Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 04, 2022 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:47:02.627Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 04, 2022 12:47:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:47:59.109Z: Workers have started successfully.
Jan 04, 2022 12:47:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T12:47:59.135Z: Workers have started successfully.
Jan 04, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T16:00:35.198Z: Cancel request is committed for workflow job: 2022-01-04_04_45_59-6945613796395576760.
Jan 04, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T16:00:35.295Z: Cleaning up.
Jan 04, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T16:00:35.372Z: Stopping worker pool...
Jan 04, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T16:00:35.425Z: Stopping worker pool...
Jan 04, 2022 4:02:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T16:02:51.554Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 04, 2022 4:02:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-04T16:02:51.588Z: Worker pool stopped.
Jan 04, 2022 4:02:57 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-04_04_45_59-6945613796395576760 finished with status CANCELLED.
Load test results for test (ID): 47e51832-f5cf-400c-ae48-f6bbcc6c458f and timestamp: 2022-01-04T12:45:53.883000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11526.081
dataflow_v2_java11_total_bytes_count             2.64100198E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220104124340
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:11440593d05b58c555056a779165a75590dae6404ce10696b92ab5de6927ed10
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220104124340]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:11440593d05b58c555056a779165a75590dae6404ce10696b92ab5de6927ed10]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220104124340] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:11440593d05b58c555056a779165a75590dae6404ce10696b92ab5de6927ed10])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:11440593d05b58c555056a779165a75590dae6404ce10696b92ab5de6927ed10
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:11440593d05b58c555056a779165a75590dae6404ce10696b92ab5de6927ed10
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:11440593d05b58c555056a779165a75590dae6404ce10696b92ab5de6927ed10].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 38s
89 actionable tasks: 52 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/wuqby3xz6dyns

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #200

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/200/display/redirect>

Changes:


------------------------------------------
[...truncated 44.13 KB...]
9c08266a8aba: Preparing
377ecb17b8f1: Preparing
01f796727ffe: Preparing
ddb9abc2a84b: Preparing
0af330436921: Preparing
377ecb17b8f1: Waiting
01f796727ffe: Waiting
111b68a388aa: Preparing
ddb9abc2a84b: Waiting
f49adac5f6e1: Preparing
111b68a388aa: Waiting
407cbcefd7c1: Preparing
0af330436921: Waiting
f49adac5f6e1: Waiting
3bb5258f46d2: Preparing
832e177bb500: Preparing
f9e18e59a565: Preparing
26a504e63be4: Preparing
3bb5258f46d2: Waiting
407cbcefd7c1: Waiting
8bf42db0de72: Preparing
31892cc314cb: Preparing
26a504e63be4: Waiting
832e177bb500: Waiting
11936051f93b: Preparing
f9e18e59a565: Waiting
37c283b36270: Pushed
c69cc6eeddd0: Pushed
9c08266a8aba: Pushed
d76da2a02430: Pushed
377ecb17b8f1: Pushed
08a31831c26c: Pushed
ddb9abc2a84b: Pushed
0af330436921: Pushed
3bb5258f46d2: Layer already exists
832e177bb500: Layer already exists
f9e18e59a565: Layer already exists
26a504e63be4: Layer already exists
407cbcefd7c1: Pushed
f49adac5f6e1: Pushed
01f796727ffe: Pushed
8bf42db0de72: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
111b68a388aa: Pushed
20220103124334: digest: sha256:5c4801e630e008392fca95d09a29febc044bef86325e63ab06d801c3521caa8a size: 4311

> Task :sdks:java:testing:load-tests:run
Jan 03, 2022 12:45:17 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 03, 2022 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Jan 03, 2022 12:45:18 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 03, 2022 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 03, 2022 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 03, 2022 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 1 seconds
Jan 03, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 03, 2022 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash fa094f55c8dd8d82ce23e78305adb82261940beec6d9dfdab5605f134da6b11c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline--glPVcjdjYLOI-eDBa24ImGUC-7G2d_atWBfE02msRw.pb
Jan 03, 2022 12:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 03, 2022 12:45:24 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33634f04, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4993febc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7601bc96, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48a0c8aa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6192a5d5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3722c145, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cbc2e3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2975a9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@765ffb14, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57562473, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a360554, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@424de326, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4bc33720, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2dd0f797, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67064bdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a7fd0c9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@18578491, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3291b443, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@671c4166, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53d13cd4]
Jan 03, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 03, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 03, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 03, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 03, 2022 12:45:24 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f1e58ca, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57f847af, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ed34ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@553bc36c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@380e1909, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d5ef498, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@95eb320, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f521c4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4afbb6c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@10db6131, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c6017b9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4730e0f0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@506a1372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7332a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77c233af, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b56ac7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37c41ec0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35a0e495, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5399f6c5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fe64d23]
Jan 03, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 03, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 03, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 03, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 03, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 03, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 03, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 03, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 03, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 03, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 03, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 03, 2022 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Jan 03, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-03_04_45_24-3301662855178043808?project=apache-beam-testing
Jan 03, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-03_04_45_24-3301662855178043808
Jan 03, 2022 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-03_04_45_24-3301662855178043808
Jan 03, 2022 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-03T12:45:31.400Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-piq. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 03, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:35.491Z: Worker configuration: e2-standard-2 in us-central1-f.
Jan 03, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.173Z: Expanding SplittableParDo operations into optimizable parts.
Jan 03, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.205Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 03, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.261Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 03, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.320Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 03, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.350Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 03, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.420Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 03, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.513Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 03, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.560Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 03, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.591Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 03, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.624Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 03, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.658Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 03, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.690Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 03, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.715Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 03, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.738Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 03, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.788Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 03, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.814Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 03, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.846Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 03, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.877Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 03, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.907Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 03, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.931Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 03, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.965Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 03, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:36.998Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 03, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:37.031Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 03, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:37.056Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 03, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:37.087Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 03, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:37.141Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 03, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:37.173Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 03, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:37.206Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 03, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:37.233Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 03, 2022 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:45:37.709Z: Starting 5 workers in us-central1-f...
Jan 03, 2022 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:46:05.936Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 03, 2022 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:46:31.550Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 03, 2022 12:47:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:47:26.301Z: Workers have started successfully.
Jan 03, 2022 12:47:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T12:47:26.328Z: Workers have started successfully.
Jan 03, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T16:00:35.036Z: Cancel request is committed for workflow job: 2022-01-03_04_45_24-3301662855178043808.
Jan 03, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T16:00:35.088Z: Cleaning up.
Jan 03, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T16:00:35.150Z: Stopping worker pool...
Jan 03, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T16:00:35.193Z: Stopping worker pool...
Jan 03, 2022 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T16:02:57.207Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 03, 2022 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-03T16:02:57.244Z: Worker pool stopped.
Jan 03, 2022 4:03:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-03_04_45_24-3301662855178043808 finished with status CANCELLED.
Load test results for test (ID): 24d069d5-f680-4866-b469-ae572d5588ba and timestamp: 2022-01-03T12:45:17.811000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11548.556
dataflow_v2_java11_total_bytes_count             2.11090174E10
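The two metrics above can be combined into an average throughput figure for the run; a minimal sketch, assuming the units are seconds and bytes as the metric names suggest:

```python
# Metrics reported by the load test above; units assumed to be
# seconds and bytes based on the metric names.
runtime_sec = 11548.556
total_bytes = 2.11090174e10

# Average end-to-end throughput over the whole run.
bytes_per_sec = total_bytes / runtime_sec
print(f"{bytes_per_sec / 1e6:.2f} MB/s")  # roughly 1.83 MB/s
```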
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
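The stack trace shows JobFailure.handleFailure turning the CANCELLED terminal state into a RuntimeException. A simplified illustration of that check, not Beam's actual implementation, with the state names as assumptions:

```python
# Illustrative simplification of the check behind
# "Invalid job state: CANCELLED." above; not Beam's actual code.
SUCCESS_STATE = "DONE"

def handle_failure(job_state: str) -> None:
    """Raise if the pipeline reached any terminal state other than DONE."""
    if job_state != SUCCESS_STATE:
        raise RuntimeError(f"Invalid job state: {job_state}.")

handle_failure("DONE")  # a successful run passes the check silently
```

Under this reading, the build fails because the load test was cancelled at its time limit rather than reaching DONE on its own.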

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220103124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5c4801e630e008392fca95d09a29febc044bef86325e63ab06d801c3521caa8a
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220103124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5c4801e630e008392fca95d09a29febc044bef86325e63ab06d801c3521caa8a]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220103124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5c4801e630e008392fca95d09a29febc044bef86325e63ab06d801c3521caa8a])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5c4801e630e008392fca95d09a29febc044bef86325e63ab06d801c3521caa8a
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5c4801e630e008392fca95d09a29febc044bef86325e63ab06d801c3521caa8a
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5c4801e630e008392fca95d09a29febc044bef86325e63ab06d801c3521caa8a].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 48s
89 actionable tasks: 52 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/r6pvf2kehh5sm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 199 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 199 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/199/ to view the results.

Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #198

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/198/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13541] Fix spotbugs issue with DLS in tail (#16407)

[noreply] [BEAM-13588] Add missing module for PVR tests and also sickbay known

[noreply] [BEAM-13575] Sickbay test that is flaky to restore precommit test signal

[noreply] [BEAM-12092] Bump jedis to version 3.8.0 (#16403)


------------------------------------------
[...truncated 44.69 KB...]
31892cc314cb: Preparing
11936051f93b: Preparing
31892cc314cb: Waiting
11936051f93b: Waiting
9efaf169ab33: Pushed
09ee4e8c9a7a: Pushed
adaa8aefeb93: Pushed
0fa3bfc97b21: Pushed
ce84906a8d75: Pushed
da52bf965b1e: Pushed
71afe8bec404: Pushed
876244654db6: Pushed
3bb5258f46d2: Layer already exists
832e177bb500: Layer already exists
f9e18e59a565: Layer already exists
26a504e63be4: Layer already exists
b3625b798ce3: Pushed
8bf42db0de72: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
fd6ec0e53c9a: Pushed
f4e94253fbf6: Pushed
da2b13271c97: Pushed
20220101124331: digest: sha256:5448f5165a3efe69bdb1b7278c1c1e2a1ab61b67f49f412c9707c4402e724f9b size: 4311

> Task :sdks:java:testing:load-tests:run
Jan 01, 2022 12:45:15 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 01, 2022 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Jan 01, 2022 12:45:16 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 01, 2022 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 01, 2022 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 01, 2022 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Jan 01, 2022 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 01, 2022 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash f4c201c582559cf176342a2737864b67d6f0279d30ce7162996a65249d5642a2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9MIBxYJVnPF2NConN4ZLZ9bwJ50wznFimWplJJ1WQqI.pb
Jan 01, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jan 01, 2022 12:45:21 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7601bc96, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48a0c8aa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6192a5d5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3722c145, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cbc2e3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2975a9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@765ffb14, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57562473, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a360554, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@424de326, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4bc33720, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2dd0f797, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67064bdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a7fd0c9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@18578491, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3291b443, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@671c4166, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53d13cd4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77865933, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@480ad82c]
Jan 01, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 01, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 01, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 01, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jan 01, 2022 12:45:21 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ed34ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@553bc36c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@380e1909, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d5ef498, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@95eb320, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f521c4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4afbb6c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@10db6131, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c6017b9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4730e0f0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@506a1372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7332a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77c233af, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b56ac7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37c41ec0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35a0e495, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5399f6c5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fe64d23, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58437801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6af5bbd0]
Jan 01, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 01, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 01, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 01, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 01, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 01, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 01, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 01, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 01, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 01, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 01, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 01, 2022 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Jan 01, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-01_04_45_21-16412038114127075776?project=apache-beam-testing
Jan 01, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-01-01_04_45_21-16412038114127075776
Jan 01, 2022 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-01-01_04_45_21-16412038114127075776
Jan 01, 2022 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-01T12:45:28.911Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-01-nwc4. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 01, 2022 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:33.711Z: Worker configuration: e2-standard-2 in us-central1-f.
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:34.503Z: Expanding SplittableParDo operations into optimizable parts.
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:34.533Z: Expanding CollectionToSingleton operations into optimizable parts.
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:34.601Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:34.674Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:34.713Z: Expanding GroupByKey operations into streaming Read/Write steps
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:34.782Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:34.882Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:34.930Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:34.961Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:34.994Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.021Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.043Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.070Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.104Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.138Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.171Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.206Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.245Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.280Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.310Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.342Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.375Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.409Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.454Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.489Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.519Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.553Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.588Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:35.621Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jan 01, 2022 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:36.048Z: Starting 5 workers in us-central1-f...
Jan 01, 2022 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:45:47.653Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 01, 2022 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:46:26.897Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jan 01, 2022 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:47:22.288Z: Workers have started successfully.
Jan 01, 2022 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T12:47:22.316Z: Workers have started successfully.
Jan 01, 2022 1:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-01T13:45:38.432Z: Staged package jackson-dataformat-cbor-2.13.0-87i8Y8vQiEl2BVWwSbyVD2k9G-KE9nB4qPOXblapNm4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-dataformat-cbor-2.13.0-87i8Y8vQiEl2BVWwSbyVD2k9G-KE9nB4qPOXblapNm4.jar' is inaccessible.
Jan 01, 2022 1:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-01T13:45:39.541Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 01, 2022 1:48:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-01T13:48:39.675Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 01, 2022 1:51:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-01T13:51:38.751Z: Staged package jackson-dataformat-cbor-2.13.0-87i8Y8vQiEl2BVWwSbyVD2k9G-KE9nB4qPOXblapNm4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-dataformat-cbor-2.13.0-87i8Y8vQiEl2BVWwSbyVD2k9G-KE9nB4qPOXblapNm4.jar' is inaccessible.
Jan 01, 2022 1:51:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-01T13:51:39.906Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 01, 2022 1:54:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-01T13:54:39.497Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 01, 2022 1:57:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-01T13:57:38.630Z: Staged package jackson-dataformat-cbor-2.13.0-87i8Y8vQiEl2BVWwSbyVD2k9G-KE9nB4qPOXblapNm4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-dataformat-cbor-2.13.0-87i8Y8vQiEl2BVWwSbyVD2k9G-KE9nB4qPOXblapNm4.jar' is inaccessible.
Jan 01, 2022 1:57:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-01T13:57:39.794Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 01, 2022 2:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-01T14:00:39.488Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 01, 2022 2:03:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-01-01T14:03:38.599Z: Staged package jackson-dataformat-cbor-2.13.0-87i8Y8vQiEl2BVWwSbyVD2k9G-KE9nB4qPOXblapNm4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-dataformat-cbor-2.13.0-87i8Y8vQiEl2BVWwSbyVD2k9G-KE9nB4qPOXblapNm4.jar' is inaccessible.
Jan 01, 2022 2:03:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-01-01T14:03:39.574Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Jan 01, 2022 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T16:00:31.897Z: Cancel request is committed for workflow job: 2022-01-01_04_45_21-16412038114127075776.
Jan 01, 2022 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T16:00:31.946Z: Cleaning up.
Jan 01, 2022 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T16:00:32.058Z: Stopping worker pool...
Jan 01, 2022 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T16:00:32.114Z: Stopping worker pool...
Jan 01, 2022 4:02:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T16:02:59.295Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jan 01, 2022 4:02:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-01-01T16:02:59.327Z: Worker pool stopped.
Jan 01, 2022 4:03:06 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-01-01_04_45_21-16412038114127075776 finished with status CANCELLED.
Load test results for test (ID): 5f434bb9-4a16-460c-9420-6672f7023320 and timestamp: 2022-01-01T12:45:16.445000000Z:
Metric:                                  Value:
dataflow_v2_java11_runtime_sec           11562.529
dataflow_v2_java11_total_bytes_count     1.98463947E10
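A quick back-of-the-envelope check (our own arithmetic, not emitted by the test harness) turns the two reported metrics into an average throughput figure for the run before it was cancelled:

```python
# Values copied from the load test results table above.
runtime_sec = 11562.529        # dataflow_v2_java11_runtime_sec
total_bytes = 1.98463947e10    # dataflow_v2_java11_total_bytes_count

# Average throughput over the whole run, in megabytes per second.
throughput_mb_per_sec = total_bytes / runtime_sec / 1e6
print(f"{throughput_mb_per_sec:.2f} MB/s")  # roughly 1.72 MB/s
```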
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220101124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5448f5165a3efe69bdb1b7278c1c1e2a1ab61b67f49f412c9707c4402e724f9b
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220101124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5448f5165a3efe69bdb1b7278c1c1e2a1ab61b67f49f412c9707c4402e724f9b]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220101124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5448f5165a3efe69bdb1b7278c1c1e2a1ab61b67f49f412c9707c4402e724f9b])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5448f5165a3efe69bdb1b7278c1c1e2a1ab61b67f49f412c9707c4402e724f9b
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5448f5165a3efe69bdb1b7278c1c1e2a1ab61b67f49f412c9707c4402e724f9b
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5448f5165a3efe69bdb1b7278c1c1e2a1ab61b67f49f412c9707c4402e724f9b].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 51s
89 actionable tasks: 52 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ikjufpq4rbezs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #197

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/197/display/redirect?page=changes>

Changes:

[stranniknm] [BEAM-13555] playground - quick fix for safari

[mmack] [BEAM-13009] Fix tests to ensure absence of duplicates only per request

[mmack] [adhoc] Use predefined min/max combiners in SQS reader to gather stats

[zyichi] Fix class not found in perf integration tests.

[noreply] Update

[noreply] [BEAM-13430] Fix class not found error for example integration tests and

[noreply] [BEAM-13430] Re-enable dependency analysis for modules. (#16395)

[noreply] [BEAM-13484][Playground] Improve Terraform Scripts For Deploy

[noreply] [BEAM-13541] More intelligent caching of CoGBK values. (#16354)

[noreply] Merge pull request #15769 from [BEAM-13031] [Playground] Code editor -


------------------------------------------
[...truncated 99.45 KB...]
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:41.976Z: Staged package google-api-services-pubsub-v1-rev20211012-1.32.1-tlbE3NRPfq1Tei5jlklIHI3_Lr4AjKiDD53olYlLfVE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-pubsub-v1-rev20211012-1.32.1-tlbE3NRPfq1Tei5jlklIHI3_Lr4AjKiDD53olYlLfVE.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.048Z: Staged package google-api-services-storage-v1-rev20211018-1.32.1-2gWjhhV23GKwelCtsAd9ctESRIU6uPJIsOmeUs_0hi8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-storage-v1-rev20211018-1.32.1-2gWjhhV23GKwelCtsAd9ctESRIU6uPJIsOmeUs_0hi8.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.100Z: Staged package google-auth-library-credentials-1.2.1-nCGtJZrqUXMZnlzIo91vLcP41fMbfGg44IcF2O_Y5qA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-auth-library-credentials-1.2.1-nCGtJZrqUXMZnlzIo91vLcP41fMbfGg44IcF2O_Y5qA.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.144Z: Staged package google-auth-library-oauth2-http-1.2.1-ymQyaBiipFjTUxnvljub7iaq-AcvCqv2UqS-k06SUV0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-auth-library-oauth2-http-1.2.1-ymQyaBiipFjTUxnvljub7iaq-AcvCqv2UqS-k06SUV0.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.190Z: Staged package google-cloud-bigquery-2.3.3-h5H0Z_KojTQDv3i8rjpgEY4RPPFWSFRnGLhqpvwVOmc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-bigquery-2.3.3-h5H0Z_KojTQDv3i8rjpgEY4RPPFWSFRnGLhqpvwVOmc.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.233Z: Staged package google-cloud-bigquerystorage-2.4.2-5YVJ7WVXPG5s8grIJNjOCJLzuAXvvV7hycFi9NxJ_Ac.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-bigquerystorage-2.4.2-5YVJ7WVXPG5s8grIJNjOCJLzuAXvvV7hycFi9NxJ_Ac.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.270Z: Staged package google-cloud-bigtable-2.2.0-KABz6OcxVcBW19P_ZXYkYmoNylNzEKunlIxSVP6ryik.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-bigtable-2.2.0-KABz6OcxVcBW19P_ZXYkYmoNylNzEKunlIxSVP6ryik.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.312Z: Staged package google-cloud-core-2.2.0-bg2jk3pGS8qiBzYFrbcz7Adg9ZCCcWQWfGjbckAfX_g.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-core-2.2.0-bg2jk3pGS8qiBzYFrbcz7Adg9ZCCcWQWfGjbckAfX_g.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.356Z: Staged package google-cloud-core-grpc-2.2.0-25EdFn1IvmF3iXOnwS1gTe1otKU1ONZ8yntYNSwEM1A.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-core-grpc-2.2.0-25EdFn1IvmF3iXOnwS1gTe1otKU1ONZ8yntYNSwEM1A.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.398Z: Staged package google-cloud-core-http-2.2.0-m-jbHJU73e-M-rojYZiTQYKX-gb3ZWoffl24bynuFpk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-core-http-2.2.0-m-jbHJU73e-M-rojYZiTQYKX-gb3ZWoffl24bynuFpk.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.436Z: Staged package google-cloud-firestore-3.0.6-0H7ybq6QzTpRy9fPhvcrT77cj81RRbcyO9SXDS7ozxk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-firestore-3.0.6-0H7ybq6QzTpRy9fPhvcrT77cj81RRbcyO9SXDS7ozxk.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.480Z: Staged package google-cloud-pubsub-1.114.7-woRLOVDIiIpw8CLOmWlYCs2gKrDJr_HFlXd_UHco4Hs.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-pubsub-1.114.7-woRLOVDIiIpw8CLOmWlYCs2gKrDJr_HFlXd_UHco4Hs.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.560Z: Staged package google-cloud-spanner-6.14.0-tQ8OcjcjMFVRBiw6mkwxf2e43T9agLgRzEBanWqjlvc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-spanner-6.14.0-tQ8OcjcjMFVRBiw6mkwxf2e43T9agLgRzEBanWqjlvc.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.622Z: Staged package google-http-client-1.40.1-BpXaPzhHg4yT7ID8hldkBl5hADVfKxmOZjz_JIwGcmM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-1.40.1-BpXaPzhHg4yT7ID8hldkBl5hADVfKxmOZjz_JIwGcmM.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.664Z: Staged package google-http-client-apache-v2-1.40.1-cs8kwbGkAltMXtc7wVYlUFJEZNA-uRLq2kM6-mjL1Kc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-apache-v2-1.40.1-cs8kwbGkAltMXtc7wVYlUFJEZNA-uRLq2kM6-mjL1Kc.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.725Z: Staged package google-http-client-appengine-1.40.1-NF6cBtmubRxM6pmMKfLCGa9VpaudAxDIh7imCPupxNk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-appengine-1.40.1-NF6cBtmubRxM6pmMKfLCGa9VpaudAxDIh7imCPupxNk.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.763Z: Staged package google-http-client-gson-1.40.1-4LLTtTR1cge1KtAXql5h0N-TN73CsLiiP2bGxKNiqDE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-gson-1.40.1-4LLTtTR1cge1KtAXql5h0N-TN73CsLiiP2bGxKNiqDE.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.795Z: Staged package google-http-client-jackson2-1.40.1-Z6N6POQYvgaGZyJkrLI07CaXlh8rylsLMY0DfYqo9Kg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-jackson2-1.40.1-Z6N6POQYvgaGZyJkrLI07CaXlh8rylsLMY0DfYqo9Kg.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.832Z: Staged package google-http-client-protobuf-1.40.1-TtuwwZAI-ILzLP08Iu0cYugu62RsDofESZAOrus5EFU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-protobuf-1.40.1-TtuwwZAI-ILzLP08Iu0cYugu62RsDofESZAOrus5EFU.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.869Z: Staged package google-oauth-client-1.32.1-BjA55K1f9S_AagXi8aDjR87ky_Eeqbea63qbRdfG2_I.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-oauth-client-1.32.1-BjA55K1f9S_AagXi8aDjR87ky_Eeqbea63qbRdfG2_I.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.911Z: Staged package grpc-alts-1.41.0-buUDV4vPyIeTwOEME8gfxOF1vd15_qcfFK0lKyG_RkI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-alts-1.41.0-buUDV4vPyIeTwOEME8gfxOF1vd15_qcfFK0lKyG_RkI.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.947Z: Staged package grpc-api-1.41.0-aCBqGaxOsCYRSqPna6z64TGcWLvghzVNK5Ble6brd0g.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-api-1.41.0-aCBqGaxOsCYRSqPna6z64TGcWLvghzVNK5Ble6brd0g.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:42.991Z: Staged package grpc-auth-1.41.0-XXIg-0qjPOJgMrqBiBimY3HRI2979J4ZCElMIJBOiW4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-auth-1.41.0-XXIg-0qjPOJgMrqBiBimY3HRI2979J4ZCElMIJBOiW4.jar' is inaccessible.
Dec 31, 2021 1:09:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.062Z: Staged package grpc-context-1.41.0-DjVVMFHhq-1W33SPICLwdYBznLUyUGaeNSxLBdTu9iQ.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-context-1.41.0-DjVVMFHhq-1W33SPICLwdYBznLUyUGaeNSxLBdTu9iQ.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.103Z: Staged package grpc-core-1.41.0-DKE1JbG9j-inB3fVeARoifvpuFLIn5hpkMYmQHSW0hI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-core-1.41.0-DKE1JbG9j-inB3fVeARoifvpuFLIn5hpkMYmQHSW0hI.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.148Z: Staged package grpc-gcp-1.1.0-zKq--BAvLo2Cks8hjNm7iNASsOWFeFMqwc55hcc1YEY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-gcp-1.1.0-zKq--BAvLo2Cks8hjNm7iNASsOWFeFMqwc55hcc1YEY.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.188Z: Staged package grpc-google-cloud-bigquerystorage-v1-2.4.2-rcyu5coRb3EhnZYcqVFKVTBwhLq_0TtMhHMWm0lJ__w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-bigquerystorage-v1-2.4.2-rcyu5coRb3EhnZYcqVFKVTBwhLq_0TtMhHMWm0lJ__w.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.229Z: Staged package grpc-google-cloud-bigquerystorage-v1beta1-0.128.2-ete21VWpi5GseyDR_FnEThEmAJf71-MqB14Ln51KLvo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-bigquerystorage-v1beta1-0.128.2-ete21VWpi5GseyDR_FnEThEmAJf71-MqB14Ln51KLvo.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.265Z: Staged package grpc-google-cloud-bigquerystorage-v1beta2-0.128.2-B4C0HfWYubAE_53ivzw1sJKsKD4I5NHY_wMvwDqUOpc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-bigquerystorage-v1beta2-0.128.2-B4C0HfWYubAE_53ivzw1sJKsKD4I5NHY_wMvwDqUOpc.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.311Z: Staged package grpc-google-cloud-bigtable-admin-v2-2.2.0-LkV5bnp7fgLAzTBX7LgKIvs7DlKD7G0Q01lin_rH8ac.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-bigtable-admin-v2-2.2.0-LkV5bnp7fgLAzTBX7LgKIvs7DlKD7G0Q01lin_rH8ac.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.394Z: Staged package grpc-google-cloud-bigtable-v2-2.2.0-qgo1zO_IHIy4Dmca1kB7VggPE7yqRbbiCTZPenixFlw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-bigtable-v2-2.2.0-qgo1zO_IHIy4Dmca1kB7VggPE7yqRbbiCTZPenixFlw.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.433Z: Staged package grpc-google-cloud-pubsub-v1-1.96.7-tvXiMnIk_9ysKh_V9bdga6iTlt80SFH6JmDfEHfNURA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-pubsub-v1-1.96.7-tvXiMnIk_9ysKh_V9bdga6iTlt80SFH6JmDfEHfNURA.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.477Z: Staged package grpc-google-cloud-spanner-admin-database-v1-6.14.0-N3sEkaCh7OdyURtjtvlCmBDKIfy88Q5BnFF76usTO6M.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-spanner-admin-database-v1-6.14.0-N3sEkaCh7OdyURtjtvlCmBDKIfy88Q5BnFF76usTO6M.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.531Z: Staged package grpc-google-cloud-spanner-admin-instance-v1-6.14.0-ZsNtJIROHYhwMWP8en-Xz2Yurf-b1E5euM7Jg1cPe5k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-spanner-admin-instance-v1-6.14.0-ZsNtJIROHYhwMWP8en-Xz2Yurf-b1E5euM7Jg1cPe5k.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.567Z: Staged package grpc-google-cloud-spanner-v1-6.14.0-qVyAMCy3RymhF9rEgQ_4t3QI1S9iq3eM6w_nyYb4pg0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-spanner-v1-6.14.0-qVyAMCy3RymhF9rEgQ_4t3QI1S9iq3eM6w_nyYb4pg0.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.603Z: Staged package grpc-google-cloud-storage-v2-2.0.1-alpha-lJsdHAHrV8TzJsKX-hrfXVKVS7MsFS0C6BBA6Sf7sIc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-storage-v2-2.0.1-alpha-lJsdHAHrV8TzJsKX-hrfXVKVS7MsFS0C6BBA6Sf7sIc.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.646Z: Staged package grpc-google-common-protos-2.6.0-NPLxjU4ENdg2phomClAqI2VfxW_qfHU28jEvsuXycbs.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-common-protos-2.6.0-NPLxjU4ENdg2phomClAqI2VfxW_qfHU28jEvsuXycbs.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.688Z: Staged package grpc-grpclb-1.41.0-KzVoSsMCeWN_1GtLlTN_sXQP2s-JtSi9-oj-lmuX8Jw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-grpclb-1.41.0-KzVoSsMCeWN_1GtLlTN_sXQP2s-JtSi9-oj-lmuX8Jw.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.728Z: Staged package grpc-netty-1.41.0-FukMMrsnufcSkq76WASP0ibUqpSw0FrV2kCc-D12PJo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-netty-1.41.0-FukMMrsnufcSkq76WASP0ibUqpSw0FrV2kCc-D12PJo.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.771Z: Staged package grpc-netty-shaded-1.41.0-lJ8OF6tIvAkRhUWA-Zva4PJSFIJmwcD5Z0e1r6dAZUk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-netty-shaded-1.41.0-lJ8OF6tIvAkRhUWA-Zva4PJSFIJmwcD5Z0e1r6dAZUk.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.809Z: Staged package grpc-protobuf-1.41.0-Q4P84ABZaYAiAyfh9sB-L4VbcQjRuaYCd5495YRneZY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-protobuf-1.41.0-Q4P84ABZaYAiAyfh9sB-L4VbcQjRuaYCd5495YRneZY.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.871Z: Staged package grpc-stub-1.41.0-FBU9bDpjIwB1rwrfBLYKTFjcQBMkLll5y_P17xO-uLU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-stub-1.41.0-FBU9bDpjIwB1rwrfBLYKTFjcQBMkLll5y_P17xO-uLU.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:43.950Z: Staged package guava-31.0.1-jre-1b6U1l6HvSGfsxk60VF7qlWjuI_JHSHPc1gmq1rwh7k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/guava-31.0.1-jre-1b6U1l6HvSGfsxk60VF7qlWjuI_JHSHPc1gmq1rwh7k.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:44.042Z: Staged package httpclient-4.5.13-b-kCalZsalABYIzz_DIZZkH2weXhmG0QN8zb1fMe90M.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/httpclient-4.5.13-b-kCalZsalABYIzz_DIZZkH2weXhmG0QN8zb1fMe90M.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:44.079Z: Staged package httpcore-4.4.14--VYgnkUMsdDFF3bfvSPlPp3Y25oSmO1itwvwlEumOyg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/httpcore-4.4.14--VYgnkUMsdDFF3bfvSPlPp3Y25oSmO1itwvwlEumOyg.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:44.217Z: Staged package jackson-annotations-2.13.0-gflyTYhD6LCPj2wGCeeisDDQDDSGHErH4gmacjUEfW8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-annotations-2.13.0-gflyTYhD6LCPj2wGCeeisDDQDDSGHErH4gmacjUEfW8.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:44.245Z: Staged package jackson-core-2.13.0-NIvFmzSN8ugHs1bx1i0q-0GpdAczKKvHc-sJMrhV0sg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-core-2.13.0-NIvFmzSN8ugHs1bx1i0q-0GpdAczKKvHc-sJMrhV0sg.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:44.320Z: Staged package jackson-databind-2.13.0-nIJtJxdiaHd63Pl-HG4gUcfjOnqqLDcMLoxgd_2do_Q.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-databind-2.13.0-nIJtJxdiaHd63Pl-HG4gUcfjOnqqLDcMLoxgd_2do_Q.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:44.378Z: Staged package jackson-dataformat-yaml-2.13.0-qrbF97bznPcp-z7JnadDcAu6rQdd6kHQZE6N8EyPnqM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-dataformat-yaml-2.13.0-qrbF97bznPcp-z7JnadDcAu6rQdd6kHQZE6N8EyPnqM.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:44.725Z: Staged package metrics-core-3.1.2-JFuipmqbxxDOTbFHERJud7y05tlu9-YiZZKA88kMu1w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/metrics-core-3.1.2-JFuipmqbxxDOTbFHERJud7y05tlu9-YiZZKA88kMu1w.jar' is inaccessible.
Dec 31, 2021 1:09:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:45.115Z: Staged package proto-google-cloud-bigquerystorage-v1-2.4.2-I8k8jupfFMXmd3oCRu97y-SArCaAGuc9S9lAKYz6aNY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-bigquerystorage-v1-2.4.2-I8k8jupfFMXmd3oCRu97y-SArCaAGuc9S9lAKYz6aNY.jar' is inaccessible.
Dec 31, 2021 1:09:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:45.154Z: Staged package proto-google-cloud-bigquerystorage-v1beta1-0.128.2-rY58mb0Q3U3wXkzhrxUkoFiELXER0H_ZeBM56mk09gY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-bigquerystorage-v1beta1-0.128.2-rY58mb0Q3U3wXkzhrxUkoFiELXER0H_ZeBM56mk09gY.jar' is inaccessible.
Dec 31, 2021 1:09:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:45.186Z: Staged package proto-google-cloud-bigquerystorage-v1beta2-0.128.2-QS2p7NzODxTEWnvg5_0_0qQkS3IcYzxDPom7NrGk2PU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-bigquerystorage-v1beta2-0.128.2-QS2p7NzODxTEWnvg5_0_0qQkS3IcYzxDPom7NrGk2PU.jar' is inaccessible.
Dec 31, 2021 1:09:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:45.225Z: Staged package proto-google-cloud-bigtable-admin-v2-2.2.0-roMg0XgAMQ2FIQF6CZxvH9TaB4kHQlkbpPnfZ33Qgh8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-bigtable-admin-v2-2.2.0-roMg0XgAMQ2FIQF6CZxvH9TaB4kHQlkbpPnfZ33Qgh8.jar' is inaccessible.
Dec 31, 2021 1:09:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:45.271Z: Staged package proto-google-cloud-bigtable-v2-2.2.0-eQuZNGvDK3rktI8lIFj3RceyFRnCQZQ7kzsVL_jefpk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-bigtable-v2-2.2.0-eQuZNGvDK3rktI8lIFj3RceyFRnCQZQ7kzsVL_jefpk.jar' is inaccessible.
Dec 31, 2021 1:09:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:45.315Z: Staged package proto-google-cloud-datastore-v1-0.92.3-GAooTTW0iEb04NomkkOPT5sN-SkixSjTSTIbKBgWRmU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-datastore-v1-0.92.3-GAooTTW0iEb04NomkkOPT5sN-SkixSjTSTIbKBgWRmU.jar' is inaccessible.
Dec 31, 2021 1:09:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:45.360Z: Staged package proto-google-cloud-firestore-bundle-v1-3.0.6-hMoeG-RDGv3XTTCFZdGmw_KrJY-8n-l4umeZu-mJuH0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-firestore-bundle-v1-3.0.6-hMoeG-RDGv3XTTCFZdGmw_KrJY-8n-l4umeZu-mJuH0.jar' is inaccessible.
Dec 31, 2021 1:09:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:45.394Z: Staged package proto-google-cloud-firestore-v1-3.0.6--VBOSAcR_FEKBkitnXBJzdARz49V4U2Go2hDVvYVyn8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-firestore-v1-3.0.6--VBOSAcR_FEKBkitnXBJzdARz49V4U2Go2hDVvYVyn8.jar' is inaccessible.
Dec 31, 2021 1:09:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:45.440Z: Staged package proto-google-cloud-pubsub-v1-1.96.7-ELakveR57_uj4HA8dVD0JE-uvJ2vQ5t62sVSbSQLqkA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-pubsub-v1-1.96.7-ELakveR57_uj4HA8dVD0JE-uvJ2vQ5t62sVSbSQLqkA.jar' is inaccessible.
Dec 31, 2021 1:09:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:45.495Z: Staged package proto-google-cloud-spanner-admin-database-v1-6.14.0-svBhrFylwOEpV1RV3tyxpohFIg7ZauQu7ILOheRZHyE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-spanner-admin-database-v1-6.14.0-svBhrFylwOEpV1RV3tyxpohFIg7ZauQu7ILOheRZHyE.jar' is inaccessible.
Dec 31, 2021 1:09:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:45.540Z: Staged package proto-google-cloud-spanner-admin-instance-v1-6.14.0-6ZF4J-m2Y8168Yzax45VkVJ1tPALrotea0sjlS7mLsc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-spanner-admin-instance-v1-6.14.0-6ZF4J-m2Y8168Yzax45VkVJ1tPALrotea0sjlS7mLsc.jar' is inaccessible.
Dec 31, 2021 1:09:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:45.579Z: Staged package proto-google-cloud-spanner-v1-6.14.0-ygyOMRjsWchGubRL9kRWduJyW5sn-6MSNv_IbwPn6Bo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-spanner-v1-6.14.0-ygyOMRjsWchGubRL9kRWduJyW5sn-6MSNv_IbwPn6Bo.jar' is inaccessible.
Dec 31, 2021 1:09:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:45.611Z: Staged package proto-google-cloud-storage-v2-2.0.1-alpha-hkn1Zre08VdXlAcFbsybY0a1fEj8QkQMcZ8KH3SE20Q.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-storage-v2-2.0.1-alpha-hkn1Zre08VdXlAcFbsybY0a1fEj8QkQMcZ8KH3SE20Q.jar' is inaccessible.
Dec 31, 2021 1:09:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:45.648Z: Staged package proto-google-common-protos-2.6.0-6_Zu7qKrSyUfjKuUmLKZMDB5AvbypHYNyp9VWCtJCjM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-common-protos-2.6.0-6_Zu7qKrSyUfjKuUmLKZMDB5AvbypHYNyp9VWCtJCjM.jar' is inaccessible.
Dec 31, 2021 1:09:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:45.692Z: Staged package proto-google-iam-v1-1.1.6-RDXWTySL7P9f8jyZL9WO8aLT9sN8SWxcV_HoEQET4l8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-iam-v1-1.1.6-RDXWTySL7P9f8jyZL9WO8aLT9sN8SWxcV_HoEQET4l8.jar' is inaccessible.
Dec 31, 2021 1:09:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:45.737Z: Staged package protobuf-java-3.18.1-2wmd4uALeiExmzP6x8QPbYG6kTyyw7xBwbUatSrQreA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/protobuf-java-3.18.1-2wmd4uALeiExmzP6x8QPbYG6kTyyw7xBwbUatSrQreA.jar' is inaccessible.
Dec 31, 2021 1:09:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:45.782Z: Staged package protobuf-java-util-3.18.1-Cxo2gtdJGmkoEno6pqyzGm8EZhdIRfwbKVUcw4KeK_k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/protobuf-java-util-3.18.1-Cxo2gtdJGmkoEno6pqyzGm8EZhdIRfwbKVUcw4KeK_k.jar' is inaccessible.
Dec 31, 2021 1:09:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T13:09:45.871Z: Staged package snakeyaml-1.28-NURqFCFDXUXkxqwN47U3hSfVzCRGwHGD4kRHcwzh__o.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/snakeyaml-1.28-NURqFCFDXUXkxqwN47U3hSfVzCRGwHGD4kRHcwzh__o.jar' is inaccessible.
Dec 31, 2021 1:09:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-31T13:09:46.033Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 31, 2021 3:51:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T15:51:43.495Z: Staged package slf4j-api-1.7.30-zboHlk0btAoHYUhcax6ML4_Z6x0ZxTkorA1_lRAQXFc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/slf4j-api-1.7.30-zboHlk0btAoHYUhcax6ML4_Z6x0ZxTkorA1_lRAQXFc.jar' is inaccessible.
Dec 31, 2021 3:51:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T15:51:43.533Z: Staged package slf4j-jdk14-1.7.30-4PnbBJN49kZ5QXcUVJlSMyhft633LEZ-ZdryXmc6y6g.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/slf4j-jdk14-1.7.30-4PnbBJN49kZ5QXcUVJlSMyhft633LEZ-ZdryXmc6y6g.jar' is inaccessible.
Dec 31, 2021 3:51:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T15:51:43.679Z: Staged package xz-1.5-hvMPqHdfo6Ys2znR7XimAZFkwQWIZASNQsvuJE4m6EA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/xz-1.5-hvMPqHdfo6Ys2znR7XimAZFkwQWIZASNQsvuJE4m6EA.jar' is inaccessible.
Dec 31, 2021 3:51:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-31T15:51:43.768Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 31, 2021 3:54:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-31T15:54:43.801Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 31, 2021 3:57:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T15:57:43.595Z: Staged package slf4j-api-1.7.30-zboHlk0btAoHYUhcax6ML4_Z6x0ZxTkorA1_lRAQXFc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/slf4j-api-1.7.30-zboHlk0btAoHYUhcax6ML4_Z6x0ZxTkorA1_lRAQXFc.jar' is inaccessible.
Dec 31, 2021 3:57:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T15:57:43.631Z: Staged package slf4j-jdk14-1.7.30-4PnbBJN49kZ5QXcUVJlSMyhft633LEZ-ZdryXmc6y6g.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/slf4j-jdk14-1.7.30-4PnbBJN49kZ5QXcUVJlSMyhft633LEZ-ZdryXmc6y6g.jar' is inaccessible.
Dec 31, 2021 3:57:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-31T15:57:43.785Z: Staged package xz-1.5-hvMPqHdfo6Ys2znR7XimAZFkwQWIZASNQsvuJE4m6EA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/xz-1.5-hvMPqHdfo6Ys2znR7XimAZFkwQWIZASNQsvuJE4m6EA.jar' is inaccessible.
Dec 31, 2021 3:57:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-31T15:57:43.840Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 31, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-31T16:00:35.892Z: Cancel request is committed for workflow job: 2021-12-31_04_45_26-9839654050190384555.
Dec 31, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-31T16:00:35.925Z: Cleaning up.
Dec 31, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-31T16:00:35.996Z: Stopping worker pool...
Dec 31, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-31T16:00:36.041Z: Stopping worker pool...
Dec 31, 2021 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-31T16:02:56.246Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 31, 2021 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-31T16:02:56.280Z: Worker pool stopped.
Dec 31, 2021 4:03:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-31_04_45_26-9839654050190384555 finished with status CANCELLED.
Load test results for test (ID): f069d659-a75c-413e-ba89-1bcc4152f76d and timestamp: 2021-12-31T12:45:21.184000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11535.943
dataflow_v2_java11_total_bytes_count             2.26421956E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211231124330
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1c0bb3cc854516926cb8435a412a0c55892191237abf60aa7984c1df29803623
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211231124330]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1c0bb3cc854516926cb8435a412a0c55892191237abf60aa7984c1df29803623]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211231124330] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1c0bb3cc854516926cb8435a412a0c55892191237abf60aa7984c1df29803623])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1c0bb3cc854516926cb8435a412a0c55892191237abf60aa7984c1df29803623
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1c0bb3cc854516926cb8435a412a0c55892191237abf60aa7984c1df29803623
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1c0bb3cc854516926cb8435a412a0c55892191237abf60aa7984c1df29803623].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:fac13068a9736398cc394221d973238172e98fb2089fe8b887697d60acb48351
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:fac13068a9736398cc394221d973238172e98fb2089fe8b887697d60acb48351
ERROR: (gcloud.container.images.delete) Not found: response: {'status': '404', 'content-length': '168', 'x-xss-protection': '0', 'transfer-encoding': 'chunked', 'server': 'Docker Registry', '-content-encoding': 'gzip', 'docker-distribution-api-version': 'registry/2.0', 'cache-control': 'private', 'date': 'Fri, 31 Dec 2021 16:03:06 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
Failed to compute blob liveness for manifest: 'sha256:fac13068a9736398cc394221d973238172e98fb2089fe8b887697d60acb48351': None

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 291

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 48s
89 actionable tasks: 52 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/45cb27nhwkqom

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #196

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/196/display/redirect?page=changes>

Changes:

[Steve Niemitz] [BEAM-13488] Work around bug in classpath detection

[noreply] [BEAM-13578] Validate that repeatable PipelineOptions need to be

[noreply] [BEAM-13430] Update comment over eclipse plugin. (#16387)

[zyichi] [BEAM-13430] Fix broken ULR validates runner tests

[noreply] Update venv creation commands in release scripts. (#16381)

[Steve Niemitz] [BEAM-13459] Cache artifacts across job runs on python+dataflow

[noreply] [BEAM-12777] Stable URL for current API docs (#16327)

[noreply] [BEAM-13539] correct resource override dependencies (#16392)

[Kyle Weaver] [BEAM-13581] Remove previous job name for Flink PVR precommit.

[emilyye] Moving to 2.37.0-SNAPSHOT on master branch.

[noreply] Release website update for Beam 2.35.0 (#16115)

[noreply] [BEAM-13430] Re-enable checkerframework for the project excluding the


------------------------------------------
[...truncated 49.37 KB...]
66f373940980: Preparing
cfcce9963fe0: Preparing
c52059f2ee5d: Preparing
723c528c5f78: Preparing
5f2203c3e31a: Preparing
333f47bdfc73: Preparing
3bb5258f46d2: Preparing
832e177bb500: Preparing
f9e18e59a565: Preparing
26a504e63be4: Preparing
8bf42db0de72: Preparing
31892cc314cb: Preparing
66f373940980: Waiting
11936051f93b: Preparing
cfcce9963fe0: Waiting
723c528c5f78: Waiting
26a504e63be4: Waiting
5f2203c3e31a: Waiting
8bf42db0de72: Waiting
333f47bdfc73: Waiting
31892cc314cb: Waiting
3bb5258f46d2: Waiting
11936051f93b: Waiting
c7f603f0058e: Waiting
c52059f2ee5d: Waiting
f9e18e59a565: Waiting
4f8762388591: Pushed
1a7e19cf6e3b: Pushed
9e5d7637c93f: Pushed
96424d3eac6c: Pushed
c7f603f0058e: Pushed
00f54578a665: Pushed
cfcce9963fe0: Pushed
c52059f2ee5d: Pushed
5f2203c3e31a: Pushed
3bb5258f46d2: Layer already exists
f9e18e59a565: Layer already exists
832e177bb500: Layer already exists
66f373940980: Pushed
8bf42db0de72: Layer already exists
26a504e63be4: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
333f47bdfc73: Pushed
723c528c5f78: Pushed
20211230124337: digest: sha256:d33024c0898b0bb10b7ff51d23ee24caeec181b344c0d9cbd0e4f05a2db98020 size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 30, 2021 12:45:34 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 30, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 30, 2021 12:45:35 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 30, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 30, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 30, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Dec 30, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 30, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash cf758e32b63019061a6d96d4d81202a9d026b7716ad88eb773fbd518ec0c0e5d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-z3WOMrYwGQYabZbU2BICqdAmt3Fq2I63c_vVGOwMDl0.pb
Dec 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 30, 2021 12:45:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@cda6019, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@797c3c3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4012d5bc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4375b013, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cf0cacc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f5b08d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@529c2a9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c87fdf2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26bbe604, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe34b86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c98781a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f736a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4601203a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53abfc07, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c8c16c0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@80bfa9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47c40b56, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b039c6d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f5b9db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@507d64aa]
Dec 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 30, 2021 12:45:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34aeacd1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54067fdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4098dd77, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43aeb5e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2274160, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65383667, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63cd2cd2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@557a84fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6deee370, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49c17ba4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43e065f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@423c5404, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5853ca50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a0d96a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a02bfe3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a3e5cd3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c79088e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a37191a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5854a18, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d5556bf]
Dec 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Dec 30, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-30_04_45_41-14461742849939712139?project=apache-beam-testing
Dec 30, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-30_04_45_41-14461742849939712139
Dec 30, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-30_04_45_41-14461742849939712139
Dec 30, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-30T12:45:49.203Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-f8zw. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 30, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:54.834Z: Worker configuration: e2-standard-2 in us-central1-f.
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:55.476Z: Expanding SplittableParDo operations into optimizable parts.
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:55.509Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:55.579Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:55.670Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:55.701Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:55.762Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:55.873Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:55.907Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:55.941Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:55.983Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.010Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.033Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.061Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.088Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.147Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.181Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.217Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.265Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.299Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.323Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.350Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.384Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.413Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 30, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.438Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 30, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.484Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 30, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.545Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 30, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.666Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 30, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.727Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 30, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:56.805Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 30, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:45:57.426Z: Starting 5 workers in us-central1-f...
Dec 30, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:46:07.696Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 30, 2021 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:46:47.057Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 30, 2021 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:47:41.179Z: Workers have started successfully.
Dec 30, 2021 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T12:47:41.215Z: Workers have started successfully.
Dec 30, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T16:00:39.363Z: Cancel request is committed for workflow job: 2021-12-30_04_45_41-14461742849939712139.
Dec 30, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T16:00:39.444Z: Cleaning up.
Dec 30, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T16:00:39.512Z: Stopping worker pool...
Dec 30, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T16:00:39.556Z: Stopping worker pool...
Dec 30, 2021 4:03:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T16:03:05.380Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 30, 2021 4:03:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-30T16:03:05.407Z: Worker pool stopped.
Dec 30, 2021 4:03:10 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-30_04_45_41-14461742849939712139 finished with status CANCELLED.
Load test results for test (ID): ec339e52-3ad6-427f-a5f4-330f277c2c74 and timestamp: 2021-12-30T12:45:34.941000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11514.611
dataflow_v2_java11_total_bytes_count             2.58986545E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211230124337
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d33024c0898b0bb10b7ff51d23ee24caeec181b344c0d9cbd0e4f05a2db98020
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211230124337]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d33024c0898b0bb10b7ff51d23ee24caeec181b344c0d9cbd0e4f05a2db98020]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211230124337] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d33024c0898b0bb10b7ff51d23ee24caeec181b344c0d9cbd0e4f05a2db98020])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d33024c0898b0bb10b7ff51d23ee24caeec181b344c0d9cbd0e4f05a2db98020
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d33024c0898b0bb10b7ff51d23ee24caeec181b344c0d9cbd0e4f05a2db98020
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d33024c0898b0bb10b7ff51d23ee24caeec181b344c0d9cbd0e4f05a2db98020].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 51s
109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/7efeevipuddh4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #195

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/195/display/redirect?page=changes>

Changes:

[dpcollins] [BEAM-13402] Simplify PubsubLiteSink

[Kyle Weaver] [BEAM-13521] flink pvr batch precommit

[Kyle Weaver] [BEAM-13571] Fix ClassNotFound exception in Flink tests

[Kyle Weaver] [BEAM-13498] [BEAM-13573] exclude new tests on Flink

[noreply] Exclude UsesOnWindowExpiration by category from Dataflow v2 streaming

[noreply] [BEAM-13052] Increment pubsub python version and fix breakages. (#16126)

[noreply] [BEAM-13052] Add Pub/Sub Lite xlang transforms in python (#15727)

[noreply] [BEAM-13402] Version bump Pub/Sub Lite and implement changes to ensure


------------------------------------------
[...truncated 44.10 KB...]
26a504e63be4: Preparing
8bf42db0de72: Preparing
31892cc314cb: Preparing
11936051f93b: Preparing
e55d55f036ac: Waiting
26a504e63be4: Waiting
e4aada6129fc: Waiting
3bb5258f46d2: Waiting
7d14dd19b753: Waiting
832e177bb500: Waiting
79b056163f51: Waiting
8bf42db0de72: Waiting
31892cc314cb: Waiting
11936051f93b: Waiting
f9e18e59a565: Waiting
e27740273bd6: Waiting
eb9331e0c24e: Waiting
cf874dadde9b: Waiting
5b1831a6a4ca: Pushed
e5c2291c5cbf: Pushed
4bcb9b02dcdc: Pushed
e55d55f036ac: Pushed
e548b597699e: Pushed
18a9e383e81f: Pushed
7d14dd19b753: Pushed
79b056163f51: Pushed
3bb5258f46d2: Layer already exists
832e177bb500: Layer already exists
e27740273bd6: Pushed
e4aada6129fc: Pushed
f9e18e59a565: Layer already exists
26a504e63be4: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
8bf42db0de72: Layer already exists
eb9331e0c24e: Pushed
cf874dadde9b: Pushed
20211229124330: digest: sha256:c20b7d271e664dd827cac2faf0eb922ad88f35f9577ebcc865ed6598b9eba905 size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 29, 2021 12:45:22 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 29, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 29, 2021 12:45:23 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 29, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 29, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 29, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Dec 29, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 29, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 9646933fb259160f2eb74f74e47729f04be85b28fb01fd9eaec765609d1144be> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-lkaTP7JZFg8ut0905Hcp8EvoWyj7Af2ersdlYJ0RRL4.pb
Dec 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 29, 2021 12:45:27 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5793b87, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12704e15, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@512575e9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f1a16fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2373ad99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33634f04, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4993febc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7601bc96, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48a0c8aa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6192a5d5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3722c145, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cbc2e3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2975a9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@765ffb14, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57562473, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a360554, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@424de326, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4bc33720, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2dd0f797, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67064bdc]
Dec 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 29, 2021 12:45:27 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65d8dff8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@444f44c5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@303f1234, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24d61e4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2149594a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f1e58ca, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57f847af, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ed34ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@553bc36c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@380e1909, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d5ef498, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@95eb320, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f521c4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4afbb6c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@10db6131, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c6017b9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4730e0f0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@506a1372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7332a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77c233af]
Dec 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 29, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-29_04_45_27-6039810460170652176?project=apache-beam-testing
Dec 29, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-29_04_45_27-6039810460170652176
Dec 29, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-29_04_45_27-6039810460170652176
Dec 29, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-29T12:45:36.200Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-j1t0. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 29, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:41.122Z: Worker configuration: e2-standard-2 in us-central1-f.
Dec 29, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:41.607Z: Expanding SplittableParDo operations into optimizable parts.
Dec 29, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:41.628Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 29, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:41.693Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 29, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:41.794Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 29, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:41.828Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 29, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:41.927Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 29, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.028Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 29, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.055Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 29, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.086Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 29, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.118Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 29, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.145Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 29, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.176Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 29, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.201Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 29, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.238Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 29, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.267Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 29, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.299Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 29, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.326Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 29, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.353Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 29, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.384Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 29, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.416Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 29, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.438Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 29, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.467Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 29, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.500Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 29, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.526Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 29, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.570Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 29, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.605Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 29, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.638Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 29, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.672Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 29, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:42.703Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 29, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:45:43.115Z: Starting 5 workers in us-central1-f...
Dec 29, 2021 12:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:46:16.383Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 29, 2021 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:46:29.492Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 29, 2021 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:47:23.181Z: Workers have started successfully.
Dec 29, 2021 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T12:47:23.209Z: Workers have started successfully.
Dec 29, 2021 1:21:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-29T13:21:44.471Z: Staged package gcsio-2.2.4-p9i4-xCmDBUow6YptIUTZYGtZZhNat4tocR78u1Q8kQ.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/gcsio-2.2.4-p9i4-xCmDBUow6YptIUTZYGtZZhNat4tocR78u1Q8kQ.jar' is inaccessible.
Dec 29, 2021 1:21:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-29T13:21:44.679Z: Staged package google-api-services-iamcredentials-v1-rev20210326-1.32.1-rzLlTs22sMZOFz3yEOfAB70ZZw2Gl99hr42n_k-f8-M.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-iamcredentials-v1-rev20210326-1.32.1-rzLlTs22sMZOFz3yEOfAB70ZZw2Gl99hr42n_k-f8-M.jar' is inaccessible.
Dec 29, 2021 1:21:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-29T13:21:46.855Z: Staged package util-2.2.4-r-7-okFosTGJSr5au-fNqmC_EgCN_DyG3X-dgI2RZFA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/util-2.2.4-r-7-okFosTGJSr5au-fNqmC_EgCN_DyG3X-dgI2RZFA.jar' is inaccessible.
Dec 29, 2021 1:21:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-29T13:21:46.924Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 29, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T16:00:37.050Z: Cancel request is committed for workflow job: 2021-12-29_04_45_27-6039810460170652176.
Dec 29, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T16:00:37.110Z: Cleaning up.
Dec 29, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T16:00:37.237Z: Stopping worker pool...
Dec 29, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T16:00:37.443Z: Stopping worker pool...
Dec 29, 2021 4:03:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T16:02:59.333Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 29, 2021 4:03:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-29T16:02:59.363Z: Worker pool stopped.
Dec 29, 2021 4:03:05 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-29_04_45_27-6039810460170652176 finished with status CANCELLED.
Load test results for test (ID): 5a1105b4-d15f-429f-89c2-53954f8eddf1 and timestamp: 2021-12-29T12:45:22.700000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11556.931
dataflow_v2_java11_total_bytes_count              2.1040439E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211229124330
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c20b7d271e664dd827cac2faf0eb922ad88f35f9577ebcc865ed6598b9eba905
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211229124330]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c20b7d271e664dd827cac2faf0eb922ad88f35f9577ebcc865ed6598b9eba905]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211229124330] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c20b7d271e664dd827cac2faf0eb922ad88f35f9577ebcc865ed6598b9eba905])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c20b7d271e664dd827cac2faf0eb922ad88f35f9577ebcc865ed6598b9eba905
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c20b7d271e664dd827cac2faf0eb922ad88f35f9577ebcc865ed6598b9eba905
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c20b7d271e664dd827cac2faf0eb922ad88f35f9577ebcc865ed6598b9eba905].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 51s
89 actionable tasks: 52 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/zc2qlpvipgvqk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #194

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/194/display/redirect?page=changes>

Changes:

[mmack] [adhoc] Forbid to import guava and others from org.testcontainers.shaded

[noreply] [BEAM-13526] Kafka.IO: make DeserializerProvider a public interface


------------------------------------------
[...truncated 44.10 KB...]
31892cc314cb: Preparing
11936051f93b: Preparing
4439fb655b0f: Waiting
da4ad5cd1a53: Waiting
3bb5258f46d2: Waiting
6f9b4de1510e: Waiting
832e177bb500: Waiting
b14acabcfd09: Waiting
11936051f93b: Waiting
f9e18e59a565: Waiting
31892cc314cb: Waiting
878ce1a685b1: Waiting
26a504e63be4: Waiting
79eca828d636: Waiting
c184ca102ae9: Waiting
8bf42db0de72: Waiting
d9f705c4b2c9: Pushed
212bb41a9a5b: Pushed
7f8966cf78fa: Pushed
658f4c3ab6dd: Pushed
b8f45c3b1d13: Pushed
878ce1a685b1: Pushed
c184ca102ae9: Pushed
da4ad5cd1a53: Pushed
3bb5258f46d2: Layer already exists
832e177bb500: Layer already exists
f9e18e59a565: Layer already exists
b14acabcfd09: Pushed
26a504e63be4: Layer already exists
4439fb655b0f: Pushed
79eca828d636: Pushed
8bf42db0de72: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
6f9b4de1510e: Pushed
20211228124334: digest: sha256:1ba1d5f7db2351edc08f8180258d916daa3a65f6a21de03a25e5277e0ae63dca size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 28, 2021 12:45:19 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 28, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 28, 2021 12:45:20 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 28, 2021 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 28, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 28, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Dec 28, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 28, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 040ff368759e4279492ca9a43a3c3796957af8dbf72392b4a3e1fbf8fca95824> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-BA_zaHWeQnlJLKmkOjw3lpV6-Nv3I5K0o-H7-PypWCQ.pb
Dec 28, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 28, 2021 12:45:25 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15405bd6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@352ed70d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70730db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5793b87, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12704e15, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@512575e9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f1a16fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2373ad99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33634f04, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4993febc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7601bc96, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48a0c8aa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6192a5d5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3722c145, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cbc2e3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2975a9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@765ffb14, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57562473, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a360554, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@424de326]
Dec 28, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 28, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 28, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 28, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 28, 2021 12:45:25 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@597f0937, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ad1caa2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6b6b3572, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65d8dff8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@444f44c5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@303f1234, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24d61e4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2149594a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f1e58ca, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57f847af, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ed34ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@553bc36c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@380e1909, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d5ef498, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@95eb320, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f521c4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4afbb6c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@10db6131, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c6017b9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4730e0f0]
Dec 28, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 28, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 28, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 28, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 28, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 28, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 28, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 28, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 28, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 28, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 28, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 28, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 28, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-28_04_45_25-2016289845667715822?project=apache-beam-testing
Dec 28, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-28_04_45_25-2016289845667715822
Dec 28, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-28_04_45_25-2016289845667715822
Dec 28, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-28T12:45:31.111Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-x0om. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
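The modified name shown in the warning suggests Dataflow lowercases the job name and substitutes a replacement character for anything outside the Cloud Label alphabet (lowercase letters, digits, hyphens, at most 63 characters). A minimal sketch of such a sanitizer follows; the choice of '0' as the replacement character is an assumption inferred from the modified name above, not Dataflow's actual implementation:

```java
// Hypothetical sanitizer approximating how a job name could be turned into
// a valid Cloud Label value: lowercase letters, digits, and hyphens only,
// truncated to 63 characters. The '0' replacement character is an
// assumption based on the modified name in the warning, not Beam code.
public class LabelSanitizer {
    static String sanitize(String jobName) {
        String lower = jobName.toLowerCase();
        StringBuilder sb = new StringBuilder();
        for (char c : lower.toCharArray()) {
            if ((c >= 'a' && c <= 'z') || (c >= '0' && c <= '9') || c == '-') {
                sb.append(c);
            } else {
                sb.append('0'); // replace any disallowed character
            }
        }
        // Cloud label values are limited to 63 characters.
        return sb.length() > 63 ? sb.substring(0, 63) : sb.toString();
    }
}
```

Naming the Jenkins job with only lowercase letters, digits, and hyphens would avoid the warning entirely.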
Dec 28, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:34.994Z: Worker configuration: e2-standard-2 in us-central1-f.
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:35.767Z: Expanding SplittableParDo operations into optimizable parts.
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:35.799Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:35.898Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:35.999Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.025Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.078Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.177Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.216Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.253Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.280Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.307Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.333Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.358Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.382Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.422Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.455Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.485Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.514Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.545Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.579Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.611Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.650Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.678Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.705Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.743Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.766Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.797Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.824Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:36.854Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 28, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:45:37.190Z: Starting 5 workers in us-central1-f...
Dec 28, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:46:03.533Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
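The quota message above distinguishes custom metrics (`custom.googleapis.com/*`), which count against the per-project descriptor limit, from the built-in `dataflow.googleapis.com/job/user_counter` metric. A small sketch of identifying the custom descriptors that are candidates for cleanup by prefix; the sample descriptor names are illustrative, not taken from this project:

```java
import java.util.List;
import java.util.stream.Collectors;

// Filters metric descriptor type names down to the custom ones, which are
// the descriptors that count against the per-project quota mentioned in
// the log. Real cleanup would call the Cloud Monitoring API with these.
public class CustomMetricFilter {
    static List<String> customDescriptors(List<String> types) {
        return types.stream()
            .filter(t -> t.startsWith("custom.googleapis.com/"))
            .collect(Collectors.toList());
    }
}
```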
Dec 28, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:46:22.959Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 28, 2021 12:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:47:17.701Z: Workers have started successfully.
Dec 28, 2021 12:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T12:47:17.725Z: Workers have started successfully.
Dec 28, 2021 3:03:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-28T15:03:38.140Z: Staged package checker-qual-3.10.0-pNyILKaqxJbTM4Hl5esGBMRUg7AEvD6sk2jxuxfLJRI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/checker-qual-3.10.0-pNyILKaqxJbTM4Hl5esGBMRUg7AEvD6sk2jxuxfLJRI.jar' is inaccessible.
Dec 28, 2021 3:03:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-28T15:03:40.269Z: Staged package opencensus-contrib-grpc-util-0.28.0-uRaDRuavZZMwChvCfvdCVKofJAGYhZON2PuFK4d9VfA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/opencensus-contrib-grpc-util-0.28.0-uRaDRuavZZMwChvCfvdCVKofJAGYhZON2PuFK4d9VfA.jar' is inaccessible.
Dec 28, 2021 3:03:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-28T15:03:40.802Z: Staged package zstd-jni-1.4.5-2-lzQxwUtNCahuI7cYQRb8rC2FUB60p0MPfRhczhr0YFA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/zstd-jni-1.4.5-2-lzQxwUtNCahuI7cYQRb8rC2FUB60p0MPfRhczhr0YFA.jar' is inaccessible.
Dec 28, 2021 3:03:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-28T15:03:40.825Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 28, 2021 3:06:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-28T15:06:40.710Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 28, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T16:00:33.390Z: Cancel request is committed for workflow job: 2021-12-28_04_45_25-2016289845667715822.
Dec 28, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T16:00:33.439Z: Cleaning up.
Dec 28, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T16:00:33.522Z: Stopping worker pool...
Dec 28, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T16:00:33.606Z: Stopping worker pool...
Dec 28, 2021 4:03:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T16:03:03.745Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 28, 2021 4:03:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-28T16:03:03.778Z: Worker pool stopped.
Dec 28, 2021 4:03:11 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-28_04_45_25-2016289845667715822 finished with status CANCELLED.
Load test results for test (ID): 848dbf29-1cf7-41da-ba3f-d12511425051 and timestamp: 2021-12-28T12:45:19.917000000Z:
                              Metric:          Value:
       dataflow_v2_java11_runtime_sec       11562.848
 dataflow_v2_java11_total_bytes_count   2.31544738E10
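Dividing the two reported metrics gives the test's approximate sustained throughput (roughly 2 MB/s); this derived figure is just arithmetic on the numbers above, not a metric the test itself reports:

```java
// Derives average throughput from the load test's reported runtime and
// total byte count. The input values are copied from the metrics above.
public class Throughput {
    static double bytesPerSecond(double totalBytes, double runtimeSec) {
        return totalBytes / runtimeSec;
    }
}
```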
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
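The RuntimeException in the trace above comes from a terminal-state check: the load test treats any terminal job state other than DONE, including the CANCELLED state this job ended in, as a failure. A minimal sketch of that kind of check; the enum and message format are simplified stand-ins, not Beam's actual JobFailure code:

```java
// Simplified terminal-state check: a job finishing in any state other
// than DONE (e.g. CANCELLED) fails the load test with a RuntimeException.
// This only loosely mirrors JobFailure.handleFailure in the Beam SDK.
public class TerminalStateCheck {
    enum State { DONE, FAILED, CANCELLED }

    static void handle(State state) {
        if (state != State.DONE) {
            throw new RuntimeException("Invalid job state: " + state + ".");
        }
    }
}
```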

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211228124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ba1d5f7db2351edc08f8180258d916daa3a65f6a21de03a25e5277e0ae63dca
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211228124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ba1d5f7db2351edc08f8180258d916daa3a65f6a21de03a25e5277e0ae63dca]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211228124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ba1d5f7db2351edc08f8180258d916daa3a65f6a21de03a25e5277e0ae63dca])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ba1d5f7db2351edc08f8180258d916daa3a65f6a21de03a25e5277e0ae63dca
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ba1d5f7db2351edc08f8180258d916daa3a65f6a21de03a25e5277e0ae63dca
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ba1d5f7db2351edc08f8180258d916daa3a65f6a21de03a25e5277e0ae63dca].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 54s
89 actionable tasks: 52 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/aygfhxg5bb6ui

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #193

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/193/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #16285 from [BEAM-13492][Playground]  Update backend


------------------------------------------
[...truncated 60.32 KB...]
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:688
Dec 27, 2021 3:04:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:04:35.678Z: Staged package common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar' is inaccessible.
Dec 27, 2021 3:04:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:04:35.818Z: Staged package common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar' is inaccessible.
Dec 27, 2021 3:04:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:04:38.028Z: Staged package kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar' is inaccessible.
Dec 27, 2021 3:04:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:04:38.093Z: Staged package kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar' is inaccessible.
Dec 27, 2021 3:04:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:04:39.013Z: Staged package spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar' is inaccessible.
Dec 27, 2021 3:04:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:04:39.060Z: Staged package spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar' is inaccessible.
Dec 27, 2021 3:04:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:04:39.159Z: Staged package zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar' is inaccessible.
Dec 27, 2021 3:04:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:04:39.231Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 27, 2021 3:07:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:07:38.416Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 27, 2021 3:10:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:10:35.671Z: Staged package common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar' is inaccessible.
Dec 27, 2021 3:10:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:10:35.724Z: Staged package common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar' is inaccessible.
Dec 27, 2021 3:10:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:10:37.765Z: Staged package kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar' is inaccessible.
Dec 27, 2021 3:10:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:10:37.841Z: Staged package kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar' is inaccessible.
Dec 27, 2021 3:10:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:10:38.704Z: Staged package spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar' is inaccessible.
Dec 27, 2021 3:10:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:10:38.744Z: Staged package spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar' is inaccessible.
Dec 27, 2021 3:10:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:10:38.849Z: Staged package zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar' is inaccessible.
Dec 27, 2021 3:10:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:10:38.891Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 27, 2021 3:13:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:13:38.675Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 27, 2021 3:16:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:16:35.576Z: Staged package common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar' is inaccessible.
Dec 27, 2021 3:16:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:16:35.617Z: Staged package common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar' is inaccessible.
Dec 27, 2021 3:16:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:16:37.623Z: Staged package kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar' is inaccessible.
Dec 27, 2021 3:16:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:16:37.697Z: Staged package kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar' is inaccessible.
Dec 27, 2021 3:16:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:16:38.684Z: Staged package spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar' is inaccessible.
Dec 27, 2021 3:16:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:16:38.741Z: Staged package spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar' is inaccessible.
Dec 27, 2021 3:16:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:16:38.828Z: Staged package zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar' is inaccessible.
Dec 27, 2021 3:16:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:16:38.869Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 27, 2021 3:19:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:19:38.685Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 27, 2021 3:22:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:22:35.739Z: Staged package common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar' is inaccessible.
Dec 27, 2021 3:22:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:22:35.822Z: Staged package common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar' is inaccessible.
Dec 27, 2021 3:22:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:22:38.204Z: Staged package kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar' is inaccessible.
Dec 27, 2021 3:22:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:22:38.255Z: Staged package kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar' is inaccessible.
Dec 27, 2021 3:22:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:22:39.386Z: Staged package spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar' is inaccessible.
Dec 27, 2021 3:22:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:22:39.421Z: Staged package spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar' is inaccessible.
Dec 27, 2021 3:22:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:22:39.511Z: Staged package zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar' is inaccessible.
Dec 27, 2021 3:22:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:22:39.543Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 27, 2021 3:25:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:25:38.844Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 27, 2021 3:28:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:28:35.455Z: Staged package common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar' is inaccessible.
Dec 27, 2021 3:28:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:28:35.497Z: Staged package common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar' is inaccessible.
Dec 27, 2021 3:28:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:28:37.484Z: Staged package kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar' is inaccessible.
Dec 27, 2021 3:28:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:28:37.555Z: Staged package kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar' is inaccessible.
Dec 27, 2021 3:28:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:28:38.537Z: Staged package spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar' is inaccessible.
Dec 27, 2021 3:28:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:28:38.573Z: Staged package spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar' is inaccessible.
Dec 27, 2021 3:28:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:28:38.643Z: Staged package zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar' is inaccessible.
Dec 27, 2021 3:28:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:28:38.681Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 27, 2021 3:31:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:31:39.398Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 27, 2021 3:34:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:34:35.462Z: Staged package common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar' is inaccessible.
Dec 27, 2021 3:34:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:34:35.491Z: Staged package common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar' is inaccessible.
Dec 27, 2021 3:34:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:34:37.465Z: Staged package kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar' is inaccessible.
Dec 27, 2021 3:34:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:34:37.516Z: Staged package kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar' is inaccessible.
Dec 27, 2021 3:34:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:34:38.363Z: Staged package spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar' is inaccessible.
Dec 27, 2021 3:34:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:34:38.400Z: Staged package spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar' is inaccessible.
Dec 27, 2021 3:34:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:34:38.494Z: Staged package zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar' is inaccessible.
Dec 27, 2021 3:34:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:34:38.529Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 27, 2021 3:37:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:37:38.771Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 27, 2021 3:40:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:40:35.478Z: Staged package common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar' is inaccessible.
Dec 27, 2021 3:40:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:40:35.519Z: Staged package common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar' is inaccessible.
Dec 27, 2021 3:40:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:40:37.361Z: Staged package kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar' is inaccessible.
Dec 27, 2021 3:40:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:40:37.430Z: Staged package kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar' is inaccessible.
Dec 27, 2021 3:40:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:40:38.306Z: Staged package spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar' is inaccessible.
Dec 27, 2021 3:40:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:40:38.352Z: Staged package spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar' is inaccessible.
Dec 27, 2021 3:40:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:40:38.438Z: Staged package zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar' is inaccessible.
Dec 27, 2021 3:40:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:40:38.466Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 27, 2021 3:43:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:43:38.775Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 27, 2021 3:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:46:35.621Z: Staged package common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar' is inaccessible.
Dec 27, 2021 3:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:46:35.674Z: Staged package common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar' is inaccessible.
Dec 27, 2021 3:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:46:37.934Z: Staged package kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar' is inaccessible.
Dec 27, 2021 3:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:46:38.001Z: Staged package kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar' is inaccessible.
Dec 27, 2021 3:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:46:38.928Z: Staged package spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar' is inaccessible.
Dec 27, 2021 3:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:46:38.967Z: Staged package spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar' is inaccessible.
Dec 27, 2021 3:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:46:39.057Z: Staged package zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar' is inaccessible.
Dec 27, 2021 3:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:46:39.166Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 27, 2021 3:49:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:49:39.128Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 27, 2021 3:52:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:52:35.541Z: Staged package common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar' is inaccessible.
Dec 27, 2021 3:52:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:52:35.573Z: Staged package common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar' is inaccessible.
Dec 27, 2021 3:52:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:52:37.493Z: Staged package kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar' is inaccessible.
Dec 27, 2021 3:52:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:52:37.559Z: Staged package kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar' is inaccessible.
Dec 27, 2021 3:52:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:52:38.361Z: Staged package spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar' is inaccessible.
Dec 27, 2021 3:52:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:52:38.410Z: Staged package spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar' is inaccessible.
Dec 27, 2021 3:52:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:52:38.489Z: Staged package zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar' is inaccessible.
Dec 27, 2021 3:52:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:52:38.551Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 27, 2021 3:55:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:55:38.262Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 27, 2021 3:58:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:58:35.562Z: Staged package common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar' is inaccessible.
Dec 27, 2021 3:58:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:58:35.614Z: Staged package common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar' is inaccessible.
Dec 27, 2021 3:58:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:58:37.639Z: Staged package kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar' is inaccessible.
Dec 27, 2021 3:58:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:58:37.693Z: Staged package kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar' is inaccessible.
Dec 27, 2021 3:58:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:58:38.511Z: Staged package spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar' is inaccessible.
Dec 27, 2021 3:58:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:58:38.545Z: Staged package spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar' is inaccessible.
Dec 27, 2021 3:58:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-27T15:58:38.646Z: Staged package zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar' is inaccessible.
Dec 27, 2021 3:58:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-27T15:58:38.694Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
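The SEVERE lines above repeat the same handful of staged jars every few minutes, which makes the log hard to scan. A minimal triage sketch (not part of Beam — `StagedPackageTriage` and its regex are hypothetical helpers) reduces such a log to the distinct set of inaccessible packages, using only the JDK standard library:

```java
import java.util.LinkedHashSet;
import java.util.Set;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical helper: collect the distinct "Staged package ... is
// inaccessible" artifacts from Dataflow monitoring output, so repeating
// cycles like the ones above reduce to a short list of jars to check.
public class StagedPackageTriage {
    private static final Pattern INACCESSIBLE = Pattern.compile(
        "Staged package (\\S+) at location '(gs://\\S+)' is inaccessible");

    public static Set<String> extractInaccessible(String log) {
        Set<String> packages = new LinkedHashSet<>(); // preserves first-seen order, dedupes repeats
        Matcher m = INACCESSIBLE.matcher(log);
        while (m.find()) {
            packages.add(m.group(1)); // group(1) = jar name; group(2) = GCS path
        }
        return packages;
    }

    public static void main(String[] args) {
        String sample =
            "SEVERE: 2021-12-27T15:16:38.741Z: Staged package zkclient-0.10.jar "
          + "at location 'gs://bucket/staging/zkclient-0.10.jar' is inaccessible.\n"
          + "SEVERE: 2021-12-27T15:22:39.511Z: Staged package zkclient-0.10.jar "
          + "at location 'gs://bucket/staging/zkclient-0.10.jar' is inaccessible.\n";
        System.out.println(extractInaccessible(sample)); // prints [zkclient-0.10.jar]
    }
}
```

With the distinct jar names in hand, the access checks pointed to by the WARNING lines (see the linked security-and-permissions page) only need to be run once per package rather than once per log cycle.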
Dec 27, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-27T16:00:35.833Z: Cancel request is committed for workflow job: 2021-12-27_04_46_14-6106271570807984149.
Dec 27, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-27T16:00:35.867Z: Cleaning up.
Dec 27, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-27T16:00:35.966Z: Stopping worker pool...
Dec 27, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-27T16:00:36.078Z: Stopping worker pool...
Dec 27, 2021 4:03:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-27T16:02:59.940Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 27, 2021 4:03:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-27T16:02:59.979Z: Worker pool stopped.
Dec 27, 2021 4:03:06 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-27_04_46_14-6106271570807984149 finished with status CANCELLED.
Load test results for test (ID): 20b7f72c-923e-4d42-a367-f2ac1a1736c1 and timestamp: 2021-12-27T12:46:08.815000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11496.929
dataflow_v2_java11_total_bytes_count             2.32537889E10
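The two reported metrics admit a quick derived figure: average throughput over the run. A small worked example (assuming `dataflow_v2_java11_total_bytes_count` covers the bytes processed during `dataflow_v2_java11_runtime_sec`):

```java
// Derived metric from the two values reported above: average throughput.
// Assumption: total_bytes_count measures bytes processed over runtime_sec.
public class ThroughputFromMetrics {
    public static void main(String[] args) {
        double runtimeSec = 11496.929;        // dataflow_v2_java11_runtime_sec
        double totalBytes = 2.32537889e10;    // dataflow_v2_java11_total_bytes_count
        double mbPerSec = totalBytes / runtimeSec / 1e6; // decimal megabytes per second
        System.out.printf("~%.2f MB/s%n", mbPerSec); // prints ~2.02 MB/s
    }
}
```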
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
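The stack trace shows the load-test harness failing the run because CANCELLED is not an acceptable terminal state. An illustrative sketch of that rule (not Beam's actual `JobFailure` implementation — the class and enum here are hypothetical): only a cleanly finished job passes; any other terminal state throws, which is what turns the cancelled job into the FAILED task below.

```java
// Illustrative only: a harness that accepts DONE as the sole passing
// terminal state, mirroring the "Invalid job state: CANCELLED." failure above.
public class TerminalStateCheck {
    enum JobState { DONE, FAILED, CANCELLED, UPDATED }

    // Returns normally only for DONE; anything else fails the load test.
    static void handleTerminalState(JobState state) {
        if (state != JobState.DONE) {
            throw new RuntimeException("Invalid job state: " + state + ".");
        }
    }

    public static void main(String[] args) {
        handleTerminalState(JobState.DONE); // passes silently
        try {
            handleTerminalState(JobState.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // prints: Invalid job state: CANCELLED.
        }
    }
}
```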

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211227124415
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:12629fa35f20b9dc8a82682aa835377d27188fdced8f10d760bce57ac4fd4a4d
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211227124415]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:12629fa35f20b9dc8a82682aa835377d27188fdced8f10d760bce57ac4fd4a4d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211227124415] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:12629fa35f20b9dc8a82682aa835377d27188fdced8f10d760bce57ac4fd4a4d])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:12629fa35f20b9dc8a82682aa835377d27188fdced8f10d760bce57ac4fd4a4d
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:12629fa35f20b9dc8a82682aa835377d27188fdced8f10d760bce57ac4fd4a4d
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:12629fa35f20b9dc8a82682aa835377d27188fdced8f10d760bce57ac4fd4a4d].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 9s
89 actionable tasks: 52 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/vyuqn4royskj2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #192

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/192/display/redirect?page=changes>

Changes:

[alexander.zhuravlev] [BEAM-13530] Fixed scrollbar in the Output section

[noreply] Merge pull request #16355 from [BEAM-13529] [Playground] [Bugfix] Fix

[noreply] Fixed grammar error in Go ParDo function comments. (#16329)


------------------------------------------
[...truncated 51.43 KB...]
Dec 26, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 26, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 26, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 26, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 26, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 26, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 26, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 26, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-26_04_45_34-13218376411721923737?project=apache-beam-testing
Dec 26, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-26_04_45_34-13218376411721923737
Dec 26, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-26_04_45_34-13218376411721923737
Dec 26, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-26T12:45:42.044Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-bfhv. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:50.252Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:50.946Z: Expanding SplittableParDo operations into optimizable parts.
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:50.976Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.041Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.109Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.131Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.189Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.299Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.330Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.368Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.438Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.477Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.507Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.541Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.573Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.607Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.649Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.684Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.731Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.762Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.796Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.831Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.864Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.910Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.943Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:51.980Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:52.015Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:52.048Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:52.078Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:52.122Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 26, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:52.534Z: Starting 5 workers in us-central1-a...
Dec 26, 2021 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:45:59.495Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 26, 2021 12:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:46:36.659Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 26, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:47:32.761Z: Workers have started successfully.
Dec 26, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T12:47:32.781Z: Workers have started successfully.
Dec 26, 2021 1:15:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:15:53.947Z: Staged package failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar' is inaccessible.
Dec 26, 2021 1:15:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:15:55.668Z: Staged package listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar' is inaccessible.
Dec 26, 2021 1:15:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:15:55.998Z: Staged package netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar' is inaccessible.
Dec 26, 2021 1:15:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-26T13:15:56.654Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
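The SEVERE entries above name the same three staged jars on every retry cycle. As a triage aid, the failing package names can be extracted from the log so they can be checked individually against the staging bucket (for example with `gsutil stat`). This is a hypothetical helper, not part of the Beam codebase; the regex assumes the exact "Staged package ... is inaccessible." wording shown in this log, and the sample line uses a made-up jar name.

```python
import re

# Pattern for the Dataflow "Staged package ... is inaccessible." log lines above.
STAGED_PKG = re.compile(
    r"Staged package (\S+) at location '(gs://\S+)' is inaccessible\."
)

def inaccessible_packages(log_lines):
    """Map each inaccessible staged package name to its GCS location."""
    found = {}
    for line in log_lines:
        m = STAGED_PKG.search(line)
        if m:
            found[m.group(1)] = m.group(2)
    return found

# Hypothetical sample lines in the same format as the log above.
sample = [
    "SEVERE: 2021-12-26T13:15:53.947Z: Staged package example-1.0.jar at location "
    "'gs://temp-storage-for-perf-tests/loadtests/staging/example-1.0.jar' is inaccessible.",
    "WARNING: 2021-12-26T13:15:56.654Z: One or more access checks failed.",
]
print(inaccessible_packages(sample))
```

Deduplicating by package name is deliberate: the same three jars recur every few minutes here, so the map stays small no matter how long the job retries.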
Dec 26, 2021 1:18:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-26T13:18:56.096Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 26, 2021 1:21:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:21:53.873Z: Staged package failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar' is inaccessible.
Dec 26, 2021 1:21:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:21:55.537Z: Staged package listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar' is inaccessible.
Dec 26, 2021 1:21:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:21:55.806Z: Staged package netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar' is inaccessible.
Dec 26, 2021 1:21:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-26T13:21:56.402Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 26, 2021 1:24:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-26T13:24:56.202Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 26, 2021 1:27:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:27:53.875Z: Staged package failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar' is inaccessible.
Dec 26, 2021 1:27:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:27:55.592Z: Staged package listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar' is inaccessible.
Dec 26, 2021 1:27:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:27:55.822Z: Staged package netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar' is inaccessible.
Dec 26, 2021 1:27:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-26T13:27:56.453Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 26, 2021 1:30:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-26T13:30:56.257Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 26, 2021 1:33:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:33:53.851Z: Staged package failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar' is inaccessible.
Dec 26, 2021 1:33:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:33:55.452Z: Staged package listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar' is inaccessible.
Dec 26, 2021 1:33:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:33:55.704Z: Staged package netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar' is inaccessible.
Dec 26, 2021 1:33:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-26T13:33:56.295Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 26, 2021 1:36:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-26T13:36:56.110Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 26, 2021 1:39:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:39:53.704Z: Staged package failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar' is inaccessible.
Dec 26, 2021 1:39:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:39:55.359Z: Staged package listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar' is inaccessible.
Dec 26, 2021 1:39:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:39:55.622Z: Staged package netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar' is inaccessible.
Dec 26, 2021 1:39:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-26T13:39:56.208Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 26, 2021 1:42:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-26T13:42:56.567Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 26, 2021 1:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:45:53.809Z: Staged package failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar' is inaccessible.
Dec 26, 2021 1:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:45:55.346Z: Staged package listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar' is inaccessible.
Dec 26, 2021 1:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:45:55.593Z: Staged package netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar' is inaccessible.
Dec 26, 2021 1:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-26T13:45:56.244Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 26, 2021 1:48:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-26T13:48:56.238Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 26, 2021 1:51:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:51:53.783Z: Staged package failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar' is inaccessible.
Dec 26, 2021 1:51:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:51:55.505Z: Staged package listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar' is inaccessible.
Dec 26, 2021 1:51:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:51:55.744Z: Staged package netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar' is inaccessible.
Dec 26, 2021 1:51:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-26T13:51:56.328Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 26, 2021 1:54:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-26T13:54:57.480Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 26, 2021 1:57:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:57:53.806Z: Staged package failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar' is inaccessible.
Dec 26, 2021 1:57:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:57:55.444Z: Staged package listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar' is inaccessible.
Dec 26, 2021 1:57:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T13:57:55.703Z: Staged package netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar' is inaccessible.
Dec 26, 2021 1:57:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-26T13:57:56.377Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 26, 2021 2:00:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-26T14:00:56.228Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 26, 2021 2:03:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T14:03:53.732Z: Staged package failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar' is inaccessible.
Dec 26, 2021 2:03:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T14:03:55.383Z: Staged package listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar' is inaccessible.
Dec 26, 2021 2:03:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-26T14:03:55.661Z: Staged package netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar' is inaccessible.
Dec 26, 2021 2:03:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-26T14:03:56.315Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 26, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T16:00:32.862Z: Cancel request is committed for workflow job: 2021-12-26_04_45_34-13218376411721923737.
Dec 26, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T16:00:32.896Z: Cleaning up.
Dec 26, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T16:00:32.970Z: Stopping worker pool...
Dec 26, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T16:00:33.024Z: Stopping worker pool...
Dec 26, 2021 4:03:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T16:02:58.158Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 26, 2021 4:03:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-26T16:02:58.196Z: Worker pool stopped.
Dec 26, 2021 4:03:04 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-26_04_45_34-13218376411721923737 finished with status CANCELLED.
Load test results for test (ID): 3518d724-bf76-41d3-9f3a-9e62f9af6b5a and timestamp: 2021-12-26T12:45:29.709000000Z:
                             Metric:           Value:
      dataflow_v2_java11_runtime_sec        11541.274
dataflow_v2_java11_total_bytes_count    1.75150109E10
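For context, the two metrics above imply an average throughput; the arithmetic below simply divides total bytes by runtime. Treat it as a rough figure only, since the job was cancelled at the test timeout rather than running to completion.

```python
# Values copied from the metrics table above.
runtime_sec = 11541.274          # dataflow_v2_java11_runtime_sec
total_bytes = 1.75150109e10      # dataflow_v2_java11_total_bytes_count

# Average throughput over the whole (cancelled) run.
throughput_bps = total_bytes / runtime_sec
print(f"{throughput_bps / 2**20:.2f} MiB/s")  # roughly 1.45 MiB/s
```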
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
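The `JobFailure.handleFailure` frame above is where the load test turns the job's terminal state into the `RuntimeException: Invalid job state: CANCELLED.` that fails the Gradle task. The following is a minimal sketch of that check, not the actual Beam code; the state names and function shape are assumptions for illustration.

```python
# Terminal Dataflow job states; anything here other than DONE is a failed run.
TERMINAL_STATES = {"DONE", "FAILED", "CANCELLED", "UPDATED", "DRAINED"}

def handle_failure(state: str) -> None:
    """Hypothetical sketch of JobFailure.handleFailure's terminal-state check."""
    if state not in TERMINAL_STATES:
        raise ValueError(f"Non-terminal state: {state}")
    if state != "DONE":
        # Mirrors the "Invalid job state: CANCELLED." message in the trace above.
        raise RuntimeError(f"Invalid job state: {state}.")

handle_failure("DONE")  # a successful run passes through silently
```

In this build the job never finished: the watchdog cancelled it at the 16:00 timeout (see "Cancel request is committed" above), so the state seen here was CANCELLED and the task failed.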

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211226124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b2baa66d4b8dbce58c28f0f56cb1cfee194c29b246c3e273e1761ec87756fb90
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211226124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b2baa66d4b8dbce58c28f0f56cb1cfee194c29b246c3e273e1761ec87756fb90]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211226124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b2baa66d4b8dbce58c28f0f56cb1cfee194c29b246c3e273e1761ec87756fb90])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b2baa66d4b8dbce58c28f0f56cb1cfee194c29b246c3e273e1761ec87756fb90
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b2baa66d4b8dbce58c28f0f56cb1cfee194c29b246c3e273e1761ec87756fb90
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b2baa66d4b8dbce58c28f0f56cb1cfee194c29b246c3e273e1761ec87756fb90].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 50s
89 actionable tasks: 52 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/7rumlkvvskpri

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #191

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/191/display/redirect>

Changes:


------------------------------------------
[...truncated 45.89 KB...]
Dec 25, 2021 12:47:11 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 25, 2021 12:47:12 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
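This warning comes from `Pipeline.validate`, which flags transforms whose auto-generated labels collide (giving each `apply` an explicit name makes them stable). A small sketch of that duplicate-label check, using hypothetical labels in the style of this log:

```python
from collections import Counter

def unstable_names(labels):
    """Hypothetical sketch of the duplicate-transform-name check behind the
    'stable unique names' warning: return labels used more than once."""
    counts = Counter(labels)
    return sorted(name for name, n in counts.items() if n > 1)

# Two Window.Into() applications without explicit names collide:
print(unstable_names(["Read input", "Window.Into()", "Window.Into()"]))
```

Elsewhere in this log the runner avoids the collision by suffixing ("Window.Into()" and "Window.Into()2"); the warning notes that such auto-suffixed names are not stable across pipeline updates.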
Dec 25, 2021 12:47:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 25, 2021 12:47:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 25, 2021 12:47:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Dec 25, 2021 12:47:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 25, 2021 12:47:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 054409705b0d8499190adcbf08f46b4657022bdd9f6a91df456888cf64144c8e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-BUQJcFsNhJkZCty_CPRrRlcCK92fapHfRWiIz2QUTI4.pb
Dec 25, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 25, 2021 12:47:18 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15405bd6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@352ed70d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70730db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5793b87, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12704e15, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@512575e9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f1a16fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2373ad99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33634f04, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4993febc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7601bc96, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48a0c8aa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6192a5d5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3722c145, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cbc2e3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2975a9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@765ffb14, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57562473, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a360554, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@424de326]
Dec 25, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 25, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 25, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 25, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 25, 2021 12:47:18 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@597f0937, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ad1caa2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6b6b3572, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65d8dff8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@444f44c5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@303f1234, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24d61e4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2149594a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f1e58ca, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57f847af, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ed34ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@553bc36c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@380e1909, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d5ef498, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@95eb320, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f521c4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4afbb6c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@10db6131, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c6017b9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4730e0f0]
Dec 25, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 25, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 25, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 25, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 25, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 25, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 25, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 25, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 25, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 25, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 25, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 25, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 25, 2021 12:47:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-25_04_47_18-16676523383915913724?project=apache-beam-testing
Dec 25, 2021 12:47:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-25_04_47_18-16676523383915913724
Dec 25, 2021 12:47:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-25_04_47_18-16676523383915913724
Dec 25, 2021 12:47:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-25T12:47:26.601Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-66l5. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:31.208Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:31.941Z: Expanding SplittableParDo operations into optimizable parts.
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:31.971Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.026Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.074Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.094Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.136Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.213Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.235Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.257Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.282Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.304Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.331Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.358Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.378Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.398Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.419Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.448Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.490Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.511Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.532Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.559Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.580Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.602Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.639Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.664Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.691Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.730Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.772Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 25, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:32.800Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 25, 2021 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:33.205Z: Starting 5 workers in us-central1-a...
Dec 25, 2021 12:48:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:47:57.842Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 25, 2021 12:48:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:48:21.777Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 25, 2021 12:49:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:49:17.842Z: Workers have started successfully.
Dec 25, 2021 12:49:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T12:49:17.869Z: Workers have started successfully.
Dec 25, 2021 2:41:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-25T14:41:33.323Z: Staged package amazon-kinesis-client-1.14.2-qtBh3OuKuLE7S26ITTJzYsoYfG-mWjqbGmv3VJPhec8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/amazon-kinesis-client-1.14.2-qtBh3OuKuLE7S26ITTJzYsoYfG-mWjqbGmv3VJPhec8.jar' is inaccessible.
Dec 25, 2021 2:41:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-25T14:41:33.603Z: Staged package aws-java-sdk-cloudwatch-1.12.106-kUP1xZSNspLnoAnC0aAzibk9w0IXNbBNeMBESBYiZtg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-cloudwatch-1.12.106-kUP1xZSNspLnoAnC0aAzibk9w0IXNbBNeMBESBYiZtg.jar' is inaccessible.
Dec 25, 2021 2:41:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-25T14:41:33.705Z: Staged package aws-java-sdk-core-1.12.106-2a5g2j72YwhD9uNhLJWqRljsgds1wEVsdH_XLgZZOlQ.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-core-1.12.106-2a5g2j72YwhD9uNhLJWqRljsgds1wEVsdH_XLgZZOlQ.jar' is inaccessible.
Dec 25, 2021 2:41:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-25T14:41:33.740Z: Staged package aws-java-sdk-dynamodb-1.12.106-s8z13aUxEWhNzx7J8mY6rVcDrK2iLAcBqT3Pb9sNgmM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-dynamodb-1.12.106-s8z13aUxEWhNzx7J8mY6rVcDrK2iLAcBqT3Pb9sNgmM.jar' is inaccessible.
Dec 25, 2021 2:41:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-25T14:41:33.836Z: Staged package aws-java-sdk-kinesis-1.12.106-wMPghgT8YvcVokv233DC2r8Q0eeaqh7phw3Anmz6S4o.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-kinesis-1.12.106-wMPghgT8YvcVokv233DC2r8Q0eeaqh7phw3Anmz6S4o.jar' is inaccessible.
Dec 25, 2021 2:41:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-25T14:41:33.873Z: Staged package aws-java-sdk-kms-1.12.106-JkHOiVwKCYdQ3d-wR7YYej-O0UmkZOioL923IYI9PwM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-kms-1.12.106-JkHOiVwKCYdQ3d-wR7YYej-O0UmkZOioL923IYI9PwM.jar' is inaccessible.
Dec 25, 2021 2:41:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-25T14:41:33.913Z: Staged package aws-java-sdk-s3-1.12.106-5YV0cnyV551lh5cwCZGFhypcyG7NsqTcEP0IBjaAnjE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-s3-1.12.106-5YV0cnyV551lh5cwCZGFhypcyG7NsqTcEP0IBjaAnjE.jar' is inaccessible.
Dec 25, 2021 2:41:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-25T14:41:36.381Z: Staged package jakarta.activation-api-1.2.2-oYepORA671hJp6-EvX4nvi0SDEEK8pFDc3X_4GH08J0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jakarta.activation-api-1.2.2-oYepORA671hJp6-EvX4nvi0SDEEK8pFDc3X_4GH08J0.jar' is inaccessible.
Dec 25, 2021 2:41:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-25T14:41:36.416Z: Staged package jakarta.xml.bind-api-2.3.3-wEU59HLppt0MdoXqgtZ3KCJpq457rKLhRQDjgeDGzsU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jakarta.xml.bind-api-2.3.3-wEU59HLppt0MdoXqgtZ3KCJpq457rKLhRQDjgeDGzsU.jar' is inaccessible.
Dec 25, 2021 2:41:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-25T14:41:36.470Z: Staged package jmespath-java-1.12.106-_mWkV0KPXB56ULuR7j6dxTg6hrDhcycHHtCXYBv9V7s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jmespath-java-1.12.106-_mWkV0KPXB56ULuR7j6dxTg6hrDhcycHHtCXYBv9V7s.jar' is inaccessible.
Dec 25, 2021 2:41:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-25T14:41:37.466Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
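A side note on the staged-package names in the SEVERE messages above: each staged file name appears to be the original artifact name with a URL-safe, unpadded base64 SHA-256 of its contents inserted before the extension — the log itself shows this pairing (hex hash 054409705b0d... corresponds to the staged name pipeline-BUQJcFsN....pb). A minimal sketch of that naming scheme, assuming the hash is computed over the file bytes (class and method names here are illustrative, not Beam's actual PackageUtil code):

```java
import java.security.MessageDigest;
import java.util.Base64;

public class StagedNames {

  /**
   * Builds "<base>-<base64url(sha256(contents))><ext>" from a file name and its
   * bytes, matching the staged names visible in this log (e.g.
   * "amazon-kinesis-client-1.14.2-qtBh3...jar"). The base64 alphabet is URL-safe
   * ('-' and '_', no padding), so the result is a valid GCS object name.
   */
  public static String stagedName(String fileName, byte[] contents) throws Exception {
    byte[] digest = MessageDigest.getInstance("SHA-256").digest(contents);
    String hash = Base64.getUrlEncoder().withoutPadding().encodeToString(digest);
    int dot = fileName.lastIndexOf('.');
    String base = dot < 0 ? fileName : fileName.substring(0, dot);
    String ext = dot < 0 ? "" : fileName.substring(dot);
    return base + "-" + hash + ext;
  }
}
```

The "is inaccessible" errors themselves are about reading those objects back from the staging bucket (permissions, lifecycle, or transient GCS issues), not about the naming, as the follow-up WARNING about failed access checks indicates.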
Dec 25, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T16:00:36.636Z: Cancel request is committed for workflow job: 2021-12-25_04_47_18-16676523383915913724.
Dec 25, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T16:00:36.684Z: Cleaning up.
Dec 25, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T16:00:36.786Z: Stopping worker pool...
Dec 25, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T16:00:36.841Z: Stopping worker pool...
Dec 25, 2021 4:03:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T16:02:57.215Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 25, 2021 4:03:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-25T16:02:57.250Z: Worker pool stopped.
Dec 25, 2021 4:03:07 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-25_04_47_18-16676523383915913724 finished with status CANCELLED.
Load test results for test (ID): 9bad4cbc-5eda-45ba-a17a-0790d13957c7 and timestamp: 2021-12-25T12:47:12.394000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11449.001
dataflow_v2_java11_total_bytes_count              2.0612342E10
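The two result lines above are plain whitespace-separated "name value" pairs. A minimal, hypothetical parser for lines in that shape (not part of the Beam load-test framework, just an illustration of the format):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class MetricLines {

  /**
   * Parses whitespace-separated "name value" result lines like the two above
   * into an insertion-ordered map; Double.parseDouble handles both the plain
   * decimal (11449.001) and scientific (2.0612342E10) forms seen in this log.
   */
  public static Map<String, Double> parse(String... lines) {
    Map<String, Double> metrics = new LinkedHashMap<>();
    for (String line : lines) {
      String[] parts = line.trim().split("\\s+");
      if (parts.length == 2) {
        metrics.put(parts[0], Double.parseDouble(parts[1]));
      }
    }
    return metrics;
  }
}
```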
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
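The RuntimeException above originates in JobFailure.handleFailure: the harness issued a cancel request at 16:00 (the load-test timeout), and the resulting CANCELLED terminal state is then treated as a failed run. A hedged sketch of that kind of check — class and method names here are illustrative, not Beam's exact code:

```java
public class TerminalStateCheck {

  /**
   * Illustrative version of the check behind "Invalid job state: CANCELLED."
   * above: any terminal state other than DONE fails the load test, including a
   * CANCELLED state produced by the harness's own timeout-driven cancel request.
   */
  public static void failOnBadState(String terminalState) {
    if (!"DONE".equals(terminalState)) {
      throw new RuntimeException("Invalid job state: " + terminalState + ".");
    }
  }
}
```

This is why the run is marked FAILED even though the metrics above were collected successfully: cancellation after the deadline is indistinguishable, at this layer, from any other non-DONE outcome.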

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211225124342
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bebad7d86c58d54eed89480d05635972088624443b0cddccd6b50863989fb7a9
Deleted: sha256:df760ff267a791332f26d3cca3d011cba2f6cbfefd306cf517016103dd214741
Deleted: sha256:070608c4828ff34e17dd48ddf67ad230aeb5b51c2cde4e3c6c6122844fd41e65
Deleted: sha256:8d45bdd9a5a589f580cdbdde3a55663d482578d230769b2ad64b6b1d892cc2fd
Deleted: sha256:a6c5dc11c2c8d70bfc11f6b514bfc24b49dc4aba15cc716c06c1458ae1c8adb2
Deleted: sha256:830b5ace3970a11d94ee8a48004739a1942bda5aaaa0424e57598929bf9b3a4e
Deleted: sha256:1d1f4c98e49ba1f1e6aefdf8ddcfc4878c70fd6ccbe1461b945da9ff3193fef0
Deleted: sha256:0538cb8917693fc9f96e483550da7c8e2f5706a607fb6bc7f412eb5e375abe58
Deleted: sha256:66d26ec89e927225195d70eb322c605f585566bfd9e567f33cb1e5ea3744fe63
Deleted: sha256:371ae7a9cd4aae2a012ef5b22da5182d7f47e9589a663eba9f9b248bf7d74d0f
Deleted: sha256:8c3c7c3913ca7ebd7b50c513fd34a6c17dab494b1e2d3a4ec6bcd2169ea3c014
Deleted: sha256:257d8efe6f9fbbdcd8d4a5e7fa6472c2fe1c34ea65904f82cd29e7fd1a2e5ab8
Deleted: sha256:e3306b768c98a6a07e3cc238d16f252ddb8b887788456fcef54c811e0fc96eac
Deleted: sha256:0f29038e6b54989d1884dd91d755debd1252d0aa8b003f30ab688a71f95ded15
Deleted: sha256:b9318c39d70d714fa9b558511963d1d2200ab9233edf1e1d4d7aaedfc4572f05
Deleted: sha256:9e944d3449079e49245513dc9894844746295925c594d7bd12cff5f780dabacc
Deleted: sha256:88f0ea721cd3d12376cd7e39b30c4bf3749cee26e69301237399965898249eac
Deleted: sha256:efd8392f2c062eeb7dc2ce8baddf9ab0c0b977cb16b8017fcbf98b06421ebf50
Deleted: sha256:d9d8655b13ffa3b36e5e58a26091e550ce8808bd2868c651faafcadc5caf8fee
Deleted: sha256:7d7148704a86585a414759b2b46d11e2f5943d31424bf644474c56516cbc456a
Deleted: sha256:c3965f67e0a689488973e51aff064397da86ab42c7176471e7ac63281083c9e3
Deleted: sha256:e86d7e4e7c74f89e96e84006920a17d2319ca0765b23b459f849301aec421cc8
Deleted: sha256:36701e3a850164f861027992e1a9123d7d5c820e41472ddc5106c491879efce2
Deleted: sha256:def3b980082495ef69a81c1365b61bfad6bfa0e4497a4dd83750a9e4362a9fcd
Deleted: sha256:2f27fc4f1f6f33ba903d089bc6f3931432fcaa8670ec0fe7cb8744d5521467f7
Deleted: sha256:26f8987d90ad5be4b639442ebd2ddb3806f2544fca951d8202937ae2c7247e08
Deleted: sha256:ddd23614e2823a4188acb6956eac18e21e671703c840b46850f496a38d83b08d
Deleted: sha256:c112f00a596d88f0be1bcdd49b435ed995887c669934c5d18a5fadb76d867c33
Deleted: sha256:b0c075de5ecf3d7f4985951015693d71600d4f91073f63494bfe83d3e9e2e64e
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211225124342]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bebad7d86c58d54eed89480d05635972088624443b0cddccd6b50863989fb7a9]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211225124342] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bebad7d86c58d54eed89480d05635972088624443b0cddccd6b50863989fb7a9])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bebad7d86c58d54eed89480d05635972088624443b0cddccd6b50863989fb7a9
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bebad7d86c58d54eed89480d05635972088624443b0cddccd6b50863989fb7a9
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bebad7d86c58d54eed89480d05635972088624443b0cddccd6b50863989fb7a9].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 46s
89 actionable tasks: 52 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/oiigzphgwna52

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Dec 24, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-24_04_45_38-16419312113343538639
Dec 24, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-24_04_45_38-16419312113343538639
Dec 24, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-24T12:45:45.343Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-jnjx. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 24, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:49.403Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 24, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:49.944Z: Expanding SplittableParDo operations into optimizable parts.
Dec 24, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.000Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.058Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.135Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.178Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.232Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.343Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.378Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.411Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.447Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.470Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.494Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.518Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.554Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.585Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.616Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.643Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.673Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.707Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.743Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.766Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.800Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.836Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.873Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.906Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.931Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.960Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:50.993Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:51.027Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 24, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:45:51.420Z: Starting 5 workers in us-central1-a...
Dec 24, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:46:05.233Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 24, 2021 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:46:31.576Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 24, 2021 12:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:47:29.089Z: Workers have started successfully.
Dec 24, 2021 12:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T12:47:29.120Z: Workers have started successfully.
Dec 24, 2021 4:00:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T16:00:38.715Z: Cancel request is committed for workflow job: 2021-12-24_04_45_38-16419312113343538639.
Dec 24, 2021 4:00:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T16:00:38.782Z: Cleaning up.
Dec 24, 2021 4:00:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T16:00:38.908Z: Stopping worker pool...
Dec 24, 2021 4:00:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T16:00:38.973Z: Stopping worker pool...
Dec 24, 2021 4:03:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T16:03:07.648Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 24, 2021 4:03:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-24T16:03:07.689Z: Worker pool stopped.
Dec 24, 2021 4:03:13 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-24_04_45_38-16419312113343538639 finished with status CANCELLED.
Load test results for test (ID): b562686d-158d-489f-8bab-14a308559aac and timestamp: 2021-12-24T12:45:32.441000000Z:
                              Metric:        Value:
       dataflow_v2_java11_runtime_sec     11506.377
 dataflow_v2_java11_total_bytes_count  2.2251691E10
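The two metrics above imply an average throughput for the run. A minimal sketch of that arithmetic, assuming dataflow_v2_java11_runtime_sec is wall-clock seconds and dataflow_v2_java11_total_bytes_count is total bytes processed (class and variable names are illustrative, not part of the test harness):

```java
// Rough throughput implied by the load-test metrics above.
// Assumes runtime_sec is wall-clock seconds and total_bytes_count
// is the total payload bytes observed by the ByteMonitor.
public class ThroughputSketch {
    public static void main(String[] args) {
        double runtimeSec = 11506.377;       // dataflow_v2_java11_runtime_sec
        double totalBytes = 2.2251691e10;    // dataflow_v2_java11_total_bytes_count
        double mbPerSec = totalBytes / runtimeSec / 1e6;
        System.out.printf("~%.2f MB/s%n", mbPerSec); // prints "~1.93 MB/s"
    }
}
```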
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
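The RuntimeException above comes from JobFailure.handleFailure treating any terminal state other than a successful completion as a build failure, so a streaming job that ends as CANCELLED turns the build red. A minimal sketch of that check, with hypothetical names that are not the actual org.apache.beam.sdk.loadtests source:

```java
// Hypothetical sketch of the terminal-state check behind
// "Invalid job state: CANCELLED." Names are illustrative only.
public class JobFailureSketch {
    enum State { DONE, FAILED, CANCELLED }

    // A load test only counts as passing when the job reaches DONE;
    // any other terminal state (here, CANCELLED) raises and fails the build.
    static void handleFailure(State terminalState) {
        if (terminalState != State.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // prints "Invalid job state: CANCELLED."
        }
    }
}
```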

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211224124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:819138cae5fec4c77812ba3c01567df8ebad86f3ed077f70d2f5a15586519627
Deleted: sha256:15a2bb3d9408cc97712ae8de570b63a578de82d149e3317e919e3d2e0ca12d54
Deleted: sha256:eb39734b6a7e4fce7ea1b6a9cd2f34f685c4e04a000c71a2cefc9917c2070669
Deleted: sha256:12eff510f376a49bf2efdf03127c037fbe4a8b2718b9a8510481c3214ca3b41e
Deleted: sha256:322dcc2f6b3384c4ac5670ab1b7a80c2c9c14481ea1d6ec804a64dd13a3ab014
Deleted: sha256:2671a503428579d20b59d5a644368be52bfc0455f6a0f579bb64fe81c2af531f
Deleted: sha256:d1deaecc5b580eef71bf029520c4fbcf7061e31be7b9321ef6161c75ac4df0c5
Deleted: sha256:7121bd6bc762618d48e9b29beddc0b61662547af4cbe4a808848695da9a1a27f
Deleted: sha256:1b993cddde22fd1020da220202da2b8acdd9c9a0f4d430fa79cceb037210d1cb
Deleted: sha256:b3d37efd95aa3f65e65752c0ec6b26a431b229562993622143a2929afb9868fb
Deleted: sha256:23f10af8050856eb83998d1c79ce25c00c53778a31601e1493eaf0d3304a3cbb
Deleted: sha256:27a87f950e8108111850295b14a0555df6b92daf1b0c504f7314154b8771c4e1
Deleted: sha256:94f1eb787a2c830bba6de32ad940765c9fde394604312696b90738750e4b4ddb
Deleted: sha256:c184e0d7eb6820efe3c16ac4187c976174e1b0cac171288a4a5fea09d60c6970
Deleted: sha256:36a8a2c1f198fa9d34e624cd2b6a7d2adc349b5a9b200b47e1be4933102f0be6
Deleted: sha256:dfd8015243bac8453f2c50200ca7f1c55a2a1ade48fff4b8359e06bdd719e876
Deleted: sha256:ee7b90a83ebd04b49f9e94e6dacd4efc229676297776c710a9cc7e0a982972ba
Deleted: sha256:9eb0263b3da2438eefe8e7aafc5c0ed44c8a97749c096e7bbb24d03ff615f1fe
Deleted: sha256:30d4b29a3bca342ad288ee996a1b12908d710c9bf58a7dde35d2d51abed3da8a
Deleted: sha256:0e7ea0a07f55e1c70ce9190c85c3234fb22e0784f05f00373d5d6c5c0bd29b41
Deleted: sha256:8f393789752cc312f5c12efdc8684ba91c1d1006ae4a9690259075cd1a911b29
Deleted: sha256:29841c73e094a8bc1baaf1858661e85179365182288c7bfff601ad255b797900
Deleted: sha256:557599c07cc85404825d715e2a0f8ab7d61dee5429951826cdd83993510ff371
Deleted: sha256:ac224a0445fad0908c87edf0a49d1b09aa5ef7275c869277ce45b1a6e06200b5
Deleted: sha256:944ed90d1a1bddecfae0dd79863d6388f1197aafd99c1741920f48b23d8f7f25
Deleted: sha256:5aa42f150e66458c6f44c5f551c954e4a186f70eaa6383a528f7a19207095db2
Deleted: sha256:bb03c1f7b4174733b911df01f0e7fcade678ef1159ada3a2b2663c856a5d200b
Deleted: sha256:11837a38fe967820dbbcb78be21b8469f187d8cba9a16fbe721f92ef6848983b
Deleted: sha256:e7da07073c98a02b91efe12ebd1daca1e14d2ba3eab8baa6024b212900cef2f7
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211224124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:819138cae5fec4c77812ba3c01567df8ebad86f3ed077f70d2f5a15586519627]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211224124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:819138cae5fec4c77812ba3c01567df8ebad86f3ed077f70d2f5a15586519627])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:819138cae5fec4c77812ba3c01567df8ebad86f3ed077f70d2f5a15586519627
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:819138cae5fec4c77812ba3c01567df8ebad86f3ed077f70d2f5a15586519627
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:819138cae5fec4c77812ba3c01567df8ebad86f3ed077f70d2f5a15586519627].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 57s
89 actionable tasks: 52 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/o7delkueyonba

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #189

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/189/display/redirect?page=changes>

Changes:

[mmack] [BEAM-13494] Avoid draining Kinesis API limits & CPU when reading from

[arietis27] [BEAM-13517] Unable to write nulls to columns with logical types

[stranniknm] [BEAM-13501] playground - add log about processing and precompiled run

[noreply] Make S3 streaming more efficient (#15931)

[noreply] [Playground][BEAM-13358]Deploy Go service via Ci/CD (#16210)

[dpcollins] [BEAM-13430] Clean up tests that override the time of the JVM

[noreply] [BEAM-12830] Install go version w/host platform. (#16330)

[noreply] [BEAM-13399] Add extra time buffer to reduce likelihood of flake in

[noreply] [BEAM-12865] Fix triggering_frequency validation (#16315)

[noreply] [BEAM-13525] Sickbay new PardoTests for Dataflow V2 Java Streaming VR

[noreply] [BEAM-12830] Avoid flock and prepare task once. (OSX fix.) (#16333)

[noreply] Bump Python beam-master containers (#16332)

[arietis27] [BEAM-13517] Unable to write nulls to columns with logical types


------------------------------------------
[...truncated 43.93 KB...]
0da929f5ccc4: Pushed
39ef6f16b40e: Pushed
c3e5d2b29dd5: Pushed
3bb5258f46d2: Layer already exists
832e177bb500: Layer already exists
5867bd9c236b: Pushed
f9e18e59a565: Layer already exists
7bec1ec86199: Pushed
26a504e63be4: Layer already exists
cc49bcd0e7dd: Pushed
31892cc314cb: Layer already exists
8bf42db0de72: Layer already exists
11936051f93b: Layer already exists
dcc1c0a65110: Pushed
20211223124335: digest: sha256:c80958524a8555af812cc96f92bca50aba836ec2a12a934b4f0c5698bba2c51d size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 23, 2021 12:45:22 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 23, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 23, 2021 12:45:23 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 23, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Dec 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 80b377990a10555002a96335f0a942f70a7bbedfe247a957bde66500a5481bff> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-gLN3mQoQVVACqWM18KlC9wp7vt_iR6lXveZlAKVIG_8.pb
Dec 23, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 23, 2021 12:45:28 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15405bd6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@352ed70d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70730db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5793b87, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12704e15, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@512575e9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f1a16fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2373ad99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33634f04, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4993febc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7601bc96, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48a0c8aa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6192a5d5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3722c145, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cbc2e3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2975a9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@765ffb14, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57562473, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a360554, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@424de326]
Dec 23, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 23, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 23, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 23, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 23, 2021 12:45:28 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@597f0937, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ad1caa2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6b6b3572, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65d8dff8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@444f44c5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@303f1234, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24d61e4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2149594a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f1e58ca, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57f847af, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ed34ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@553bc36c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@380e1909, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d5ef498, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@95eb320, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f521c4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4afbb6c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@10db6131, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c6017b9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4730e0f0]
Dec 23, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 23, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 23, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 23, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 23, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 23, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 23, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 23, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 23, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 23, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 23, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 23, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 23, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-23_04_45_28-14917756070590393734?project=apache-beam-testing
Dec 23, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-23_04_45_28-14917756070590393734
Dec 23, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-23_04_45_28-14917756070590393734
Dec 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-23T12:45:36.149Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-nit4. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:40.866Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:41.512Z: Expanding SplittableParDo operations into optimizable parts.
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:41.552Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:41.619Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:41.711Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:41.743Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:41.846Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:41.955Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.000Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.034Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.099Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.129Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.160Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.269Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.306Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.331Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.355Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.380Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.412Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.441Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.497Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.530Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.562Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.601Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.636Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.670Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.698Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.845Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.900Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 23, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:42.937Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 23, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:45:43.457Z: Starting 5 workers in us-central1-a...
Dec 23, 2021 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:46:11.521Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 23, 2021 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:46:33.612Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 23, 2021 12:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:47:29.112Z: Workers have started successfully.
Dec 23, 2021 12:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T12:47:29.150Z: Workers have started successfully.
Dec 23, 2021 2:30:57 PM org.apache.beam.sdk.metrics.MetricsEnvironment getCurrentContainer
WARNING: Reporting metrics are not supported in the current execution environment.
Dec 23, 2021 3:54:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-23T15:54:44.735Z: Staged package commons-compress-1.21-auz9VFlyillWAc-gcljRMZcv_Dm0kutIvdWWV3ovJEo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-compress-1.21-auz9VFlyillWAc-gcljRMZcv_Dm0kutIvdWWV3ovJEo.jar' is inaccessible.
Dec 23, 2021 3:54:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-23T15:54:44.929Z: Staged package conscrypt-openjdk-uber-2.5.1-AfnHQstZKhUeLmK9U5eomAYoqWcAH82s1KpHRGeGhfM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/conscrypt-openjdk-uber-2.5.1-AfnHQstZKhUeLmK9U5eomAYoqWcAH82s1KpHRGeGhfM.jar' is inaccessible.
Dec 23, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-23T15:54:46.911Z: Staged package netty-buffer-4.1.52.Final-QcoQNa7m4PgXdZfQp2EKWqLG5vrHRKyB_eTl5Y9jlnU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-buffer-4.1.52.Final-QcoQNa7m4PgXdZfQp2EKWqLG5vrHRKyB_eTl5Y9jlnU.jar' is inaccessible.
Dec 23, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-23T15:54:46.950Z: Staged package netty-codec-4.1.52.Final-PH543M5-NT2GrH26LCUatc3dTrhuqoic_kIpeCuCjlI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-4.1.52.Final-PH543M5-NT2GrH26LCUatc3dTrhuqoic_kIpeCuCjlI.jar' is inaccessible.
Dec 23, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-23T15:54:47.004Z: Staged package netty-codec-http-4.1.52.Final-tdoo0OqHiwfTERrPXk-vQSiO8jArJkFBkP3Rp9lzyAw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-http-4.1.52.Final-tdoo0OqHiwfTERrPXk-vQSiO8jArJkFBkP3Rp9lzyAw.jar' is inaccessible.
Dec 23, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-23T15:54:47.106Z: Staged package netty-codec-http2-4.1.52.Final-61J6QtIonRLe5tR1PKJxWXije7tBxxkVzk6VOcSfk6E.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-http2-4.1.52.Final-61J6QtIonRLe5tR1PKJxWXije7tBxxkVzk6VOcSfk6E.jar' is inaccessible.
Dec 23, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-23T15:54:47.151Z: Staged package netty-codec-socks-4.1.52.Final-_sxXfgu3ch1naM_BhG7hcpEdFquA372ZldllelUdTqc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-socks-4.1.52.Final-_sxXfgu3ch1naM_BhG7hcpEdFquA372ZldllelUdTqc.jar' is inaccessible.
Dec 23, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-23T15:54:47.222Z: Staged package netty-common-4.1.52.Final-DR8QHk231TDtBNfKvVfZXAdQ8Fm7az-t_tDIDUc0xxc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-common-4.1.52.Final-DR8QHk231TDtBNfKvVfZXAdQ8Fm7az-t_tDIDUc0xxc.jar' is inaccessible.
Dec 23, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-23T15:54:47.263Z: Staged package netty-handler-4.1.52.Final-LNwh-_rgSUDIWSkDl3nJP1H2GPXvshYHkxnc_PMqVcE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-handler-4.1.52.Final-LNwh-_rgSUDIWSkDl3nJP1H2GPXvshYHkxnc_PMqVcE.jar' is inaccessible.
Dec 23, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-23T15:54:47.311Z: Staged package netty-handler-proxy-4.1.52.Final-6NB4Y860vUgG4lWh1TLIZQ8POy1NAGrW0hgS6UG0L9w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-handler-proxy-4.1.52.Final-6NB4Y860vUgG4lWh1TLIZQ8POy1NAGrW0hgS6UG0L9w.jar' is inaccessible.
Dec 23, 2021 3:54:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-23T15:54:47.353Z: Staged package netty-resolver-4.1.52.Final-butyojOXnLRbC01tRsWkEzLFfAz4qAOx78YNu-y6CaY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-resolver-4.1.52.Final-butyojOXnLRbC01tRsWkEzLFfAz4qAOx78YNu-y6CaY.jar' is inaccessible.
Dec 23, 2021 3:54:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-23T15:54:47.419Z: Staged package netty-transport-4.1.52.Final-mj5vjA5V3jY-seoQ_ngXl-yjlOYhht8q4LTrK84LRUE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-transport-4.1.52.Final-mj5vjA5V3jY-seoQ_ngXl-yjlOYhht8q4LTrK84LRUE.jar' is inaccessible.
Dec 23, 2021 3:54:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-23T15:54:47.480Z: Staged package opencensus-api-0.28.0-DBcj8_bTBhMjhFzouIs1_dpQCBLgp1uOtfzErYyHGpU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/opencensus-api-0.28.0-DBcj8_bTBhMjhFzouIs1_dpQCBLgp1uOtfzErYyHGpU.jar' is inaccessible.
Dec 23, 2021 3:54:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-23T15:54:47.582Z: Staged package paranamer-2.8-aIyxGKYCHYGROOhVIIyVYDFoi-S0eiS7YVvsxjrO3wc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/paranamer-2.8-aIyxGKYCHYGROOhVIIyVYDFoi-S0eiS7YVvsxjrO3wc.jar' is inaccessible.
Dec 23, 2021 3:54:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-23T15:54:47.630Z: Staged package perfmark-api-0.23.0-xwW1wQwY_zAyuegXQrwvaw5WB_am38DIrQz_ddSRMEI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/perfmark-api-0.23.0-xwW1wQwY_zAyuegXQrwvaw5WB_am38DIrQz_ddSRMEI.jar' is inaccessible.
Dec 23, 2021 3:54:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-23T15:54:48.075Z: Staged package snappy-java-1.1.8.4-JMTR_B6J4HgzGrj0AamcrWhZm95KLkUWBCy1SMUbHD4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/snappy-java-1.1.8.4-JMTR_B6J4HgzGrj0AamcrWhZm95KLkUWBCy1SMUbHD4.jar' is inaccessible.
Dec 23, 2021 3:54:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-23T15:54:48.208Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 23, 2021 3:57:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-23T15:57:47.907Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 23, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T16:00:34.026Z: Cancel request is committed for workflow job: 2021-12-23_04_45_28-14917756070590393734.
Dec 23, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T16:00:34.068Z: Cleaning up.
Dec 23, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T16:00:34.167Z: Stopping worker pool...
Dec 23, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T16:00:34.229Z: Stopping worker pool...
Dec 23, 2021 4:02:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T16:02:52.348Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 23, 2021 4:02:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-23T16:02:52.396Z: Worker pool stopped.
Dec 23, 2021 4:02:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-23_04_45_28-14917756070590393734 finished with status CANCELLED.
Load test results for test (ID): 6ad7f11c-0546-4785-a691-76f2243e8cf8 and timestamp: 2021-12-23T12:45:23.551000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11553.885
dataflow_v2_java11_total_bytes_count              2.6876774E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211223124335
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c80958524a8555af812cc96f92bca50aba836ec2a12a934b4f0c5698bba2c51d
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211223124335]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c80958524a8555af812cc96f92bca50aba836ec2a12a934b4f0c5698bba2c51d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211223124335] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c80958524a8555af812cc96f92bca50aba836ec2a12a934b4f0c5698bba2c51d])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c80958524a8555af812cc96f92bca50aba836ec2a12a934b4f0c5698bba2c51d
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c80958524a8555af812cc96f92bca50aba836ec2a12a934b4f0c5698bba2c51d
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c80958524a8555af812cc96f92bca50aba836ec2a12a934b4f0c5698bba2c51d].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 41s
79 actionable tasks: 49 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/46awwn3jep3ve

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #188

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/188/display/redirect?page=changes>

Changes:

[rsl4] [BEAM-13397] Bump numpy to 1.21 for M1 chip

[rsl4] re-generate requirements.txt

[Kyle Weaver] [BEAM-13498] Sickbay portable Flink testProcessElementSkew.

[stranniknm] [BEAM-13502]: fix loading example on embedded version

[noreply] [BEAM-13430] Start the process of upgrading the Gradle 7. (#16292)

[noreply] [BEAM-13430] Introduce new testRuntimeMigration configuration that

[noreply] Update flink cluster to use a supported dataproc version (1.2 -> 1.5)

[Robert Bradshaw] Add microbenchmark for row coder.

[noreply] [BEAM-13499] Add warning about hcatalog to release notes and javadoc

[Robert Bradshaw] Compile RowCoderImpl with Cython.

[Robert Bradshaw] RowCoder optimizations.

[Robert Bradshaw] Further optimizations for common case of components in order.

[noreply] [BEAM-13430] Remove propdeps and replace with compileOnly (#16308)

[Robert Bradshaw] Optimize rows with null fields.

[noreply] Merge pull request #16304 from [BEAM-13491] [Playground] Examples'

[noreply] Merge pull request #16283 from [BEAM-13448] [Playground] track run code

[noreply] Merge pull request #16244 from [BEAM-13463] [Playground] add retries to

[noreply] Merge pull request #16241 from [BEAM-13440] [Playground] Implement

[noreply] [BEAM-13399] Add integration test for Go SDK expansion service JAR

[noreply] [BEAM-13421] Fix bug with xs called with non-tuple key (#16258)

[Robert Bradshaw] RowCoder lint.


------------------------------------------
[...truncated 41.55 KB...]
88fc27550592: Preparing
fff952342f20: Preparing
7adb2b7f3536: Preparing
cca1320cf2c3: Preparing
06f21a01682d: Preparing
6ed5b4f6ee85: Preparing
3bb5258f46d2: Preparing
832e177bb500: Preparing
f9e18e59a565: Preparing
26a504e63be4: Preparing
8bf42db0de72: Preparing
7adb2b7f3536: Waiting
31892cc314cb: Preparing
cca1320cf2c3: Waiting
11936051f93b: Preparing
8bf42db0de72: Waiting
fff952342f20: Waiting
11936051f93b: Waiting
31892cc314cb: Waiting
26a504e63be4: Waiting
fc48acd2c1e4: Waiting
88fc27550592: Waiting
3bb5258f46d2: Waiting
06f21a01682d: Waiting
6ed5b4f6ee85: Waiting
832e177bb500: Waiting
f9e18e59a565: Waiting
c78108382bb2: Pushed
45d5eb91a523: Pushed
006560ebe63d: Pushed
fc48acd2c1e4: Pushed
fff952342f20: Pushed
c11675e674af: Pushed
38ae2c2a20be: Pushed
7adb2b7f3536: Pushed
3bb5258f46d2: Layer already exists
832e177bb500: Layer already exists
f9e18e59a565: Layer already exists
26a504e63be4: Layer already exists
8bf42db0de72: Layer already exists
31892cc314cb: Layer already exists
11936051f93b: Layer already exists
06f21a01682d: Pushed
6ed5b4f6ee85: Pushed
88fc27550592: Pushed
cca1320cf2c3: Pushed
20211222124332: digest: sha256:d898f6335789a20ed87e2f7353981dd4fff41d7f0f407cf54300b81d80f0265f size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 22, 2021 12:45:24 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 22, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 22, 2021 12:45:25 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 22, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 22, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 22, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Dec 22, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 22, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash be6997fb8519d4f87a7ec006563219d0e65688cb86e954f47353a2a2b896c808> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-vmmX-4UZ1Ph6fsAGVjIZ0OZWiMuG6VT0c1OioriWyAg.pb
Dec 22, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 22, 2021 12:45:30 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12704e15, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@512575e9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f1a16fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2373ad99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33634f04, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4993febc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7601bc96, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48a0c8aa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6192a5d5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3722c145, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cbc2e3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2975a9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@765ffb14, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57562473, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a360554, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@424de326, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4bc33720, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2dd0f797, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67064bdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a7fd0c9]
Dec 22, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 22, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 22, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 22, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 22, 2021 12:45:30 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@444f44c5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@303f1234, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24d61e4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2149594a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f1e58ca, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57f847af, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ed34ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@553bc36c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@380e1909, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d5ef498, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@95eb320, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f521c4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4afbb6c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@10db6131, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c6017b9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4730e0f0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@506a1372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7332a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77c233af, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b56ac7]
Dec 22, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 22, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 22, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 22, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 22, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 22, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 22, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 22, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 22, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 22, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 22, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 22, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 22, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-22_04_45_30-12883701128565853341?project=apache-beam-testing
Dec 22, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-22_04_45_30-12883701128565853341
Dec 22, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-22_04_45_30-12883701128565853341
Dec 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-22T12:45:37.892Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-jtjp. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 22, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:42.037Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:42.841Z: Expanding SplittableParDo operations into optimizable parts.
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:42.882Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:42.970Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.042Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.073Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.138Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.242Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.276Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.300Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.322Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.352Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.378Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.405Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.438Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.464Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.499Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.531Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.569Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.592Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.617Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.642Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.679Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.701Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.727Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.750Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.777Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.804Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.836Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:43.870Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:45:44.248Z: Starting 5 workers in us-central1-a...
Dec 22, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:46:04.742Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 22, 2021 12:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:46:42.806Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 22, 2021 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:47:38.165Z: Workers have started successfully.
Dec 22, 2021 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T12:47:38.200Z: Workers have started successfully.
Dec 22, 2021 2:03:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-22T14:03:47.356Z: Staged package kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar' is inaccessible.
Dec 22, 2021 2:03:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-22T14:03:47.491Z: Staged package lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar' is inaccessible.
Dec 22, 2021 2:03:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-22T14:03:48.331Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 22, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T16:00:32.342Z: Cancel request is committed for workflow job: 2021-12-22_04_45_30-12883701128565853341.
Dec 22, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T16:00:32.384Z: Cleaning up.
Dec 22, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T16:00:32.458Z: Stopping worker pool...
Dec 22, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T16:00:32.511Z: Stopping worker pool...
Dec 22, 2021 4:03:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T16:03:01.639Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 22, 2021 4:03:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-22T16:03:01.679Z: Worker pool stopped.
Dec 22, 2021 4:03:06 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-22_04_45_30-12883701128565853341 finished with status CANCELLED.
Load test results for test (ID): b43f391b-5fc6-4dd1-993b-c3158d29c147 and timestamp: 2021-12-22T12:45:25.358000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11513.358
dataflow_v2_java11_total_bytes_count             2.52363928E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

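For context on the failure above: the job's terminal state was CANCELLED (a cancel request was committed at 16:00), and the load-test harness treats a non-successful terminal state as a failure, which surfaces as the RuntimeException in the stack trace. A minimal, hypothetical sketch of that check follows; the names are illustrative only, not the actual org.apache.beam.sdk.loadtests.JobFailure code.

```java
// Hypothetical sketch of a terminal-state check similar in spirit to
// org.apache.beam.sdk.loadtests.JobFailure.handleFailure; names are illustrative.
public class JobStateCheck {
  enum State { DONE, FAILED, CANCELLED }

  // Throws when the pipeline did not finish in the DONE state,
  // producing a message like the one in the log above.
  static void handleFailure(State terminalState) {
    if (terminalState != State.DONE) {
      throw new RuntimeException("Invalid job state: " + terminalState + ".");
    }
  }

  public static void main(String[] args) {
    try {
      handleFailure(State.CANCELLED);
    } catch (RuntimeException e) {
      System.out.println(e.getMessage()); // prints: Invalid job state: CANCELLED.
    }
  }
}
```

Because the check keys only on the terminal state, a run whose pipeline was healthy but exceeded its time budget and was cancelled still fails the Gradle task, as seen in this build.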
> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211222124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d898f6335789a20ed87e2f7353981dd4fff41d7f0f407cf54300b81d80f0265f
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211222124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d898f6335789a20ed87e2f7353981dd4fff41d7f0f407cf54300b81d80f0265f]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211222124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d898f6335789a20ed87e2f7353981dd4fff41d7f0f407cf54300b81d80f0265f])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d898f6335789a20ed87e2f7353981dd4fff41d7f0f407cf54300b81d80f0265f
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d898f6335789a20ed87e2f7353981dd4fff41d7f0f407cf54300b81d80f0265f
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d898f6335789a20ed87e2f7353981dd4fff41d7f0f407cf54300b81d80f0265f].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 51s
79 actionable tasks: 48 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5uqocsbtfisw4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #187

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/187/display/redirect?page=changes>

Changes:

[benjamin.gonzalez] [BEAM-13088] Add parameter tempWriteDataset to BigQueryIO to store temp

[stranniknm] [BEAM-13466]: sort categories and examples by name

[aydar.zaynutdinov] [BEAM-13485][Playground]

[alexander.zhuravlev] [BEAM-13476] Changed Timeout Error Text

[alexander.zhuravlev] [BEAM-13474] Changed 'Playground' logo text color

[mmack] [BEAM-13443] Avoid blocking put to Kinesis records queue to shutdown

[noreply] Merge pull request #16278 from [BEAM-13479] [Playground] Change logic

[noreply] Merge pull request #16281 from [BEAM-13475] [Playground] [Bugfix] Error

[noreply] Merge pull request #16279 from [BEAM-13473] [Playground] [Bugfix] Reset

[noreply] Merge pull request #16232 from [BEAM-13285][Playground] Add steps to

[noreply] [BEAM-13399] Add check for dev versions of JARs to download code

[noreply] Merge pull request #16122 from [BEAM-13345] [Playground] add resizable

[benjamin.gonzalez] [BEAM-13088] Make tempDataset final

[noreply] Merge pull request #16170 from [BEAM-13411][Playground] Add getting of

[noreply] Merge pull request #16172 from [BEAM-13417] [Playground] Add java

[noreply] Merge pull request #16240 from [BEAM-13465][Playground] Change error

[noreply] Merge pull request #16234 from [BEAM-13461][Playground] [Bugfix] Error

[noreply] Merge pull request #15489 from [BEAM-12865] Allow custom batch_duration

[noreply] [BEAM-13483] Increase timeout of Java Examples Dataflow suite. (#16226)

[Kyle Weaver] [BEAM-13496] Upgrade Flink runner to include log4j patches.

[Kyle Weaver] [BEAM-13497] Correct class name in Flink tests.

[noreply] Pin transitive log4j dependencies to 2.17.0 in :sdks:java:io:hcatalog

[noreply] [BEAM-13434] Bump solr to 8.11.1 (#16296)

[noreply] Use a patched shadow 6.1.0 plugin using Log4j 2.16.0 (#16269)

[noreply] [BEAM-12830] Replace GoGradle plugin with Shell Scripts. (#16291)


------------------------------------------
[...truncated 46.42 KB...]
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
40f0e4a6936f: Preparing
fb75e428a79a: Preparing
96bcdbe57bfa: Preparing
a2799f6bf026: Preparing
b8a1873d363d: Preparing
bd22718b7352: Preparing
195254c02ade: Preparing
03a3183a1fe6: Preparing
876cfdbf9614: Preparing
ef26e774f41b: Preparing
0ed0430dd4ec: Preparing
6fb68bd9178f: Preparing
5c81f9330d99: Preparing
927f9fcef4cf: Preparing
a81f1846a0d2: Preparing
3b441d7cb46b: Preparing
d3710de04cb3: Preparing
91f7336bbfff: Preparing
e2e8c39e0f77: Preparing
927f9fcef4cf: Waiting
6fb68bd9178f: Waiting
0ed0430dd4ec: Waiting
ef26e774f41b: Waiting
5c81f9330d99: Waiting
876cfdbf9614: Waiting
195254c02ade: Waiting
a81f1846a0d2: Waiting
3b441d7cb46b: Waiting
bd22718b7352: Waiting
03a3183a1fe6: Waiting
e2e8c39e0f77: Waiting
d3710de04cb3: Waiting
b8a1873d363d: Pushed
96bcdbe57bfa: Pushed
fb75e428a79a: Pushed
40f0e4a6936f: Pushed
bd22718b7352: Pushed
a2799f6bf026: Pushed
03a3183a1fe6: Pushed
876cfdbf9614: Pushed
5c81f9330d99: Layer already exists
927f9fcef4cf: Layer already exists
a81f1846a0d2: Layer already exists
0ed0430dd4ec: Pushed
3b441d7cb46b: Layer already exists
d3710de04cb3: Layer already exists
91f7336bbfff: Layer already exists
e2e8c39e0f77: Layer already exists
195254c02ade: Pushed
6fb68bd9178f: Pushed
ef26e774f41b: Pushed
20211221124332: digest: sha256:672d4bf1af73bfc4fa7a079394c5343c38f3e45a87dc95cfd4ab51152a93d7a8 size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 21, 2021 12:45:22 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 21, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 21, 2021 12:45:23 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 21, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 21, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 21, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Dec 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 94551b66a3d335eb1c3e0ca8fabb2e46bc2703bbd8d63943543bccba998f8a5e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-lFUbZqPTNescPgyo-rsuRrwnA7vY1jlDVDvMupmPil4.pb
Dec 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 21, 2021 12:45:28 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@429f7919, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a2929a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@cda6019, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@797c3c3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4012d5bc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4375b013, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cf0cacc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f5b08d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@529c2a9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c87fdf2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26bbe604, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe34b86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c98781a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f736a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4601203a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53abfc07, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c8c16c0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@80bfa9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47c40b56, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b039c6d]
Dec 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 21, 2021 12:45:28 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67de7a99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@795f5d51, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34aeacd1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54067fdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4098dd77, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43aeb5e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2274160, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65383667, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63cd2cd2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@557a84fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6deee370, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49c17ba4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43e065f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@423c5404, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5853ca50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a0d96a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a02bfe3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a3e5cd3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c79088e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a37191a]
Dec 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 21, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-21_04_45_29-2661965857973093019?project=apache-beam-testing
Dec 21, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-21_04_45_29-2661965857973093019
Dec 21, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-21_04_45_29-2661965857973093019
Dec 21, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-21T12:45:38.227Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-lp8b. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:44.136Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:44.843Z: Expanding SplittableParDo operations into optimizable parts.
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:44.872Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:44.956Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.015Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.046Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.111Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.213Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.238Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.264Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.298Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.335Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.393Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.426Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.459Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.482Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.515Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.546Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.594Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.628Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.664Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.688Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.722Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.756Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.788Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.822Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 21, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.858Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 21, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.881Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 21, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.916Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 21, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:45.967Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 21, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:45:46.297Z: Starting 5 workers in us-central1-a...
Dec 21, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:46:06.171Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 21, 2021 12:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:46:37.688Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 21, 2021 12:47:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:47:39.877Z: Workers have started successfully.
Dec 21, 2021 12:47:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T12:47:39.907Z: Workers have started successfully.
Dec 21, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T16:00:34.361Z: Cancel request is committed for workflow job: 2021-12-21_04_45_29-2661965857973093019.
Dec 21, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T16:00:34.413Z: Cleaning up.
Dec 21, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T16:00:34.481Z: Stopping worker pool...
Dec 21, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T16:00:34.534Z: Stopping worker pool...
Dec 21, 2021 4:03:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T16:03:07.742Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 21, 2021 4:03:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-21T16:03:07.784Z: Worker pool stopped.
Dec 21, 2021 4:03:15 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-21_04_45_29-2661965857973093019 finished with status CANCELLED.
Load test results for test (ID): d613daa7-5758-459e-a0f6-4fa3f16ead4a and timestamp: 2021-12-21T12:45:23.430000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11544.156
dataflow_v2_java11_total_bytes_count             2.48444433E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211221124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:672d4bf1af73bfc4fa7a079394c5343c38f3e45a87dc95cfd4ab51152a93d7a8
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211221124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:672d4bf1af73bfc4fa7a079394c5343c38f3e45a87dc95cfd4ab51152a93d7a8]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211221124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:672d4bf1af73bfc4fa7a079394c5343c38f3e45a87dc95cfd4ab51152a93d7a8])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:672d4bf1af73bfc4fa7a079394c5343c38f3e45a87dc95cfd4ab51152a93d7a8
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:672d4bf1af73bfc4fa7a079394c5343c38f3e45a87dc95cfd4ab51152a93d7a8
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:672d4bf1af73bfc4fa7a079394c5343c38f3e45a87dc95cfd4ab51152a93d7a8].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 20m 11s
99 actionable tasks: 68 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4wppwxn4dvu4e

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #186

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/186/display/redirect?page=changes>

Changes:

[mmack] [BEAM-13233] Replace AWS API used to list shards from DescribeStream to


------------------------------------------
[...truncated 94.14 KB...]
  File "/usr/lib/python3.8/urllib/request.py", line 525, in open
    response = self._open(req, data)
  File "/usr/lib/python3.8/urllib/request.py", line 542, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 1383, in http_open
    return self.do_open(http.client.HTTPConnection, req)
  File "/usr/lib/python3.8/urllib/request.py", line 1357, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 101] Network is unreachable>
ERROR:root:Invalid url for checkstyle-8.23: http://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt. Retrying...
Traceback (most recent call last):
  File "/usr/lib/python3.8/urllib/request.py", line 1354, in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
  File "/usr/lib/python3.8/http/client.py", line 1252, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1298, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1247, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1007, in _send_output
    self.send(msg)
  File "/usr/lib/python3.8/http/client.py", line 947, in send
    self.connect()
  File "/usr/lib/python3.8/http/client.py", line 918, in connect
    self.sock = self._create_connection(
  File "/usr/lib/python3.8/socket.py", line 808, in create_connection
    raise err
  File "/usr/lib/python3.8/socket.py", line 796, in create_connection
    sock.connect(sa)
OSError: [Errno 101] Network is unreachable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py>", line 53, in pull_from_url
    url_read = urlopen(url)
  File "/usr/lib/python3.8/urllib/request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.8/urllib/request.py", line 525, in open
    response = self._open(req, data)
  File "/usr/lib/python3.8/urllib/request.py", line 542, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 1383, in http_open
    return self.do_open(http.client.HTTPConnection, req)
  File "/usr/lib/python3.8/urllib/request.py", line 1357, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 101] Network is unreachable>
ERROR:root:Invalid url for jFormatString-3.0.0: http://www.gnu.org/licenses/lgpl.html. Retrying...
Traceback (most recent call last):
  File "/usr/lib/python3.8/urllib/request.py", line 1354, in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
  File "/usr/lib/python3.8/http/client.py", line 1252, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1298, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1247, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1007, in _send_output
    self.send(msg)
  File "/usr/lib/python3.8/http/client.py", line 947, in send
    self.connect()
  File "/usr/lib/python3.8/http/client.py", line 1414, in connect
    super().connect()
  File "/usr/lib/python3.8/http/client.py", line 918, in connect
    self.sock = self._create_connection(
  File "/usr/lib/python3.8/socket.py", line 808, in create_connection
    raise err
  File "/usr/lib/python3.8/socket.py", line 796, in create_connection
    sock.connect(sa)
OSError: [Errno 101] Network is unreachable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py>", line 53, in pull_from_url
    url_read = urlopen(url)
  File "/usr/lib/python3.8/urllib/request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.8/urllib/request.py", line 525, in open
    response = self._open(req, data)
  File "/usr/lib/python3.8/urllib/request.py", line 542, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 1397, in https_open
    return self.do_open(http.client.HTTPSConnection, req,
  File "/usr/lib/python3.8/urllib/request.py", line 1357, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 101] Network is unreachable>
ERROR:root:Invalid url for spotbugs-annotations-4.0.6: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html. Retrying...
Traceback (most recent call last):
  File "/usr/lib/python3.8/urllib/request.py", line 1354, in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
  File "/usr/lib/python3.8/http/client.py", line 1252, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1298, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1247, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1007, in _send_output
    self.send(msg)
  File "/usr/lib/python3.8/http/client.py", line 947, in send
    self.connect()
  File "/usr/lib/python3.8/http/client.py", line 918, in connect
    self.sock = self._create_connection(
  File "/usr/lib/python3.8/socket.py", line 808, in create_connection
    raise err
  File "/usr/lib/python3.8/socket.py", line 796, in create_connection
    sock.connect(sa)
OSError: [Errno 101] Network is unreachable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py>", line 53, in pull_from_url
    url_read = urlopen(url)
  File "/usr/lib/python3.8/urllib/request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.8/urllib/request.py", line 525, in open
    response = self._open(req, data)
  File "/usr/lib/python3.8/urllib/request.py", line 542, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 1383, in http_open
    return self.do_open(http.client.HTTPConnection, req)
  File "/usr/lib/python3.8/urllib/request.py", line 1357, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 101] Network is unreachable>
ERROR:root:Invalid url for checkstyle-8.23: http://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt after 9 retries.
Traceback (most recent call last):
  File "/usr/lib/python3.8/urllib/request.py", line 1354, in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
  File "/usr/lib/python3.8/http/client.py", line 1252, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1298, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1247, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1007, in _send_output
    self.send(msg)
  File "/usr/lib/python3.8/http/client.py", line 947, in send
    self.connect()
  File "/usr/lib/python3.8/http/client.py", line 918, in connect
    self.sock = self._create_connection(
  File "/usr/lib/python3.8/socket.py", line 808, in create_connection
    raise err
  File "/usr/lib/python3.8/socket.py", line 796, in create_connection
    sock.connect(sa)
OSError: [Errno 101] Network is unreachable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py>", line 53, in pull_from_url
    url_read = urlopen(url)
  File "/usr/lib/python3.8/urllib/request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.8/urllib/request.py", line 525, in open
    response = self._open(req, data)
  File "/usr/lib/python3.8/urllib/request.py", line 542, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 1383, in http_open
    return self.do_open(http.client.HTTPConnection, req)
  File "/usr/lib/python3.8/urllib/request.py", line 1357, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 101] Network is unreachable>
ERROR:root:Invalid url for jFormatString-3.0.0: http://www.gnu.org/licenses/lgpl.html after 9 retries.
Traceback (most recent call last):
  File "/usr/lib/python3.8/urllib/request.py", line 1354, in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
  File "/usr/lib/python3.8/http/client.py", line 1252, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1298, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1247, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1007, in _send_output
    self.send(msg)
  File "/usr/lib/python3.8/http/client.py", line 947, in send
    self.connect()
  File "/usr/lib/python3.8/http/client.py", line 1414, in connect
    super().connect()
  File "/usr/lib/python3.8/http/client.py", line 918, in connect
    self.sock = self._create_connection(
  File "/usr/lib/python3.8/socket.py", line 808, in create_connection
    raise err
  File "/usr/lib/python3.8/socket.py", line 796, in create_connection
    sock.connect(sa)
OSError: [Errno 101] Network is unreachable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py>", line 53, in pull_from_url
    url_read = urlopen(url)
  File "/usr/lib/python3.8/urllib/request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.8/urllib/request.py", line 525, in open
    response = self._open(req, data)
  File "/usr/lib/python3.8/urllib/request.py", line 542, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 1397, in https_open
    return self.do_open(http.client.HTTPSConnection, req,
  File "/usr/lib/python3.8/urllib/request.py", line 1357, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 101] Network is unreachable>
ERROR:root:Invalid url for spotbugs-annotations-4.0.6: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html after 9 retries.
ERROR:root:['checkstyle-8.23', 'jFormatString-3.0.0', 'spotbugs-annotations-4.0.6']
ERROR:root:**************************************** Licenses were not able to be pulled automatically for some dependencies. Please search source code of the dependencies on the internet and add "license" and "notice" (if available) field to <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml> for each missing license. Dependency List: [checkstyle-8.23,jFormatString-3.0.0,spotbugs-annotations-4.0.6]
INFO:root:pull_licenses_java.py failed. It took 1217.460185 seconds with 16 threads.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py>", line 314, in <module>
    raise RuntimeError('{n} error(s) occurred.'.format(n=len(error_msg)),
RuntimeError: ('1 error(s) occurred.', ['**************************************** Licenses were not able to be pulled automatically for some dependencies. Please search source code of the dependencies on the internet and add "license" and "notice" (if available) field to <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml> for each missing license. Dependency List: [checkstyle-8.23,jFormatString-3.0.0,spotbugs-annotations-4.0.6]'])
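The tracebacks above all come from `pull_from_url` retrying each unreachable license URL until giving up. As a rough sketch (this is not the actual `pull_licenses_java.py`; the retry count of 9 is read off the log messages, and the helper names here are assumptions), the pattern is a bounded retry loop that records which dependencies still failed:

```python
# Illustrative sketch of the retry-and-collect pattern seen in the log
# above; not the real pull_licenses_java.py. MAX_RETRIES = 9 is an
# assumption taken from the "after 9 retries" messages.
from urllib.error import URLError
from urllib.request import urlopen

MAX_RETRIES = 9

def pull_license(dep, url, opener=urlopen):
    """Return the license text, or None once all retries are exhausted."""
    for attempt in range(MAX_RETRIES + 1):
        try:
            return opener(url).read()
        except (URLError, OSError):
            if attempt < MAX_RETRIES:
                print(f"Invalid url for {dep}: {url}. Retrying...")
    print(f"Invalid url for {dep}: {url} after {MAX_RETRIES} retries.")
    return None

def always_unreachable(url):
    # Stand-in opener for a host where every connect fails, as in this build.
    raise URLError(OSError(101, "Network is unreachable"))

deps = {"checkstyle-8.23": "http://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt"}
failed = [d for d, u in deps.items()
          if pull_license(d, u, opener=always_unreachable) is None]
```

With every fetch failing, `failed` ends up listing the dependencies reported in the final `ERROR:root` line.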

> Task :sdks:java:container:pullLicenses FAILED
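The error message asks for manual "license" (and, if available, "notice") entries in `dep_urls_java.yaml` for the three dependencies whose LGPL pages could not be fetched. A hypothetical entry might look like the following — the shape and the URLs are placeholders only; match the schema of the existing entries in that file:

```yaml
# Hypothetical dep_urls_java.yaml entries -- structure and URLs are
# illustrative placeholders, not verified against the real file.
checkstyle:
  '8.23':
    license: "https://example.org/placeholder/lgpl-2.1.txt"
jFormatString:
  '3.0.0':
    license: "https://example.org/placeholder/lgpl.html"
spotbugs-annotations:
  '4.0.6':
    license: "https://example.org/placeholder/lgpl-2.1.html"
```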

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:container:pullLicenses'.
> Process 'command './license_scripts/license_script.sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 20m 50s
95 actionable tasks: 64 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/2o4huixk7xtf4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #185

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/185/display/redirect>

Changes:


------------------------------------------
[...truncated 47.79 KB...]

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
94251b99fbf8: Preparing
cad9ca496af0: Preparing
1e4215f7537a: Preparing
853a66b06059: Preparing
a52a2a7ff3f4: Preparing
53a26582c8d4: Preparing
00e6dd15ed44: Preparing
c8d993d936fd: Preparing
52105e9cf5eb: Preparing
9867535ec95c: Preparing
a1d8393cccc8: Preparing
9d9cfabf0afd: Preparing
5c81f9330d99: Preparing
927f9fcef4cf: Preparing
a81f1846a0d2: Preparing
3b441d7cb46b: Preparing
d3710de04cb3: Preparing
91f7336bbfff: Preparing
e2e8c39e0f77: Preparing
52105e9cf5eb: Waiting
5c81f9330d99: Waiting
a1d8393cccc8: Waiting
9867535ec95c: Waiting
927f9fcef4cf: Waiting
9d9cfabf0afd: Waiting
e2e8c39e0f77: Waiting
a81f1846a0d2: Waiting
d3710de04cb3: Waiting
3b441d7cb46b: Waiting
00e6dd15ed44: Waiting
53a26582c8d4: Waiting
a52a2a7ff3f4: Pushed
cad9ca496af0: Pushed
1e4215f7537a: Pushed
853a66b06059: Pushed
94251b99fbf8: Pushed
c8d993d936fd: Pushed
00e6dd15ed44: Pushed
53a26582c8d4: Pushed
52105e9cf5eb: Pushed
5c81f9330d99: Layer already exists
927f9fcef4cf: Layer already exists
a81f1846a0d2: Layer already exists
3b441d7cb46b: Layer already exists
91f7336bbfff: Layer already exists
d3710de04cb3: Layer already exists
e2e8c39e0f77: Layer already exists
9d9cfabf0afd: Pushed
a1d8393cccc8: Pushed
9867535ec95c: Pushed
20211219124334: digest: sha256:3173322eaa58a48b69cacf223f1d87fbb7d8c293ef689b491d092edb1c24647d size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 19, 2021 12:45:17 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 19, 2021 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 19, 2021 12:45:18 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 19, 2021 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 19, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 19, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 1 seconds
Dec 19, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 19, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 3e48a11de31910b829385bc0ffb29fcc6abaee2b2471f2733373314c23e0bc9a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-PkihHeMZELgpOFvA_7KfzGq67iskcfJzM3MxTCPgvJo.pb
Dec 19, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 19, 2021 12:45:25 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@429f7919, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a2929a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@cda6019, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@797c3c3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4012d5bc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4375b013, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cf0cacc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f5b08d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@529c2a9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c87fdf2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26bbe604, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe34b86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c98781a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f736a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4601203a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53abfc07, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c8c16c0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@80bfa9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47c40b56, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b039c6d]
Dec 19, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 19, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 19, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 19, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 19, 2021 12:45:25 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67de7a99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@795f5d51, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34aeacd1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54067fdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4098dd77, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43aeb5e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2274160, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65383667, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63cd2cd2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@557a84fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6deee370, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49c17ba4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43e065f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@423c5404, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5853ca50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a0d96a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a02bfe3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a3e5cd3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c79088e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a37191a]
Dec 19, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 19, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 19, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 19, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 19, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 19, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 19, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 19, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 19, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 19, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 19, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 19, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 19, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-19_04_45_25-8273944174357763602?project=apache-beam-testing
Dec 19, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-19_04_45_25-8273944174357763602
Dec 19, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-19_04_45_25-8273944174357763602
Dec 19, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-19T12:45:36.290Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-kr6o. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:41.151Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:41.883Z: Expanding SplittableParDo operations into optimizable parts.
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:41.930Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.022Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.093Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.124Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.188Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.295Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.328Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.379Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.406Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.441Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.466Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.507Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.539Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.568Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.603Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.636Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.669Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.705Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.737Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.763Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.793Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.816Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.846Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.884Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.915Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:42.947Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:43.021Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:43.055Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 19, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:45:43.429Z: Starting 5 workers in us-central1-a...
Dec 19, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:46:07.886Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 19, 2021 12:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:46:35.714Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 19, 2021 12:47:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:47:36.137Z: Workers have started successfully.
Dec 19, 2021 12:47:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T12:47:36.170Z: Workers have started successfully.
Dec 19, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T16:00:31.374Z: Cancel request is committed for workflow job: 2021-12-19_04_45_25-8273944174357763602.
Dec 19, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T16:00:31.447Z: Cleaning up.
Dec 19, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T16:00:31.505Z: Stopping worker pool...
Dec 19, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T16:00:31.552Z: Stopping worker pool...
Dec 19, 2021 4:03:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T16:03:06.724Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 19, 2021 4:03:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-19T16:03:06.752Z: Worker pool stopped.
Dec 19, 2021 4:03:12 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-19_04_45_25-8273944174357763602 finished with status CANCELLED.
Load test results for test (ID): 758926bc-d3e1-44cd-a9ee-3af0cf403788 and timestamp: 2021-12-19T12:45:18.420000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11541.074
dataflow_v2_java11_total_bytes_count             2.67213413E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
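
For context on the stack trace above: the load-test harness treats any terminal job state other than a successful completion as a failure, so the deliberate cancel at 16:00 surfaces as "Invalid job state: CANCELLED." A minimal sketch of that kind of terminal-state check follows; the State enum and method below are hypothetical stand-ins for illustration, not the actual Beam JobFailure/PipelineResult API.

```java
// Hypothetical sketch of a terminal-state check like the one behind
// JobFailure.handleFailure. Names here are illustrative, not Beam's.
public class JobStateCheckSketch {

    enum State { DONE, CANCELLED, FAILED }

    // Throws when the job did not finish successfully, mirroring the
    // "Invalid job state: CANCELLED." message in the log above.
    static void handleFailure(State terminalState) {
        if (terminalState != State.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // Invalid job state: CANCELLED.
        }
    }
}
```

This is why a load test that is cancelled on schedule (e.g. by a Jenkins timeout) still marks the Gradle task, and hence the build, as FAILED even though the pipeline itself ran and reported metrics.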

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211219124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3173322eaa58a48b69cacf223f1d87fbb7d8c293ef689b491d092edb1c24647d
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211219124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3173322eaa58a48b69cacf223f1d87fbb7d8c293ef689b491d092edb1c24647d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211219124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3173322eaa58a48b69cacf223f1d87fbb7d8c293ef689b491d092edb1c24647d])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3173322eaa58a48b69cacf223f1d87fbb7d8c293ef689b491d092edb1c24647d
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3173322eaa58a48b69cacf223f1d87fbb7d8c293ef689b491d092edb1c24647d
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3173322eaa58a48b69cacf223f1d87fbb7d8c293ef689b491d092edb1c24647d].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 55s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/lkfa7lxgqh4hg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #184

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/184/display/redirect>

Changes:


------------------------------------------
[...truncated 48.53 KB...]
c5f64ca0958b: Preparing
5c81f9330d99: Preparing
927f9fcef4cf: Preparing
a81f1846a0d2: Preparing
3b441d7cb46b: Preparing
d3710de04cb3: Preparing
91f7336bbfff: Preparing
e2e8c39e0f77: Preparing
729213c8affb: Waiting
013a401b202f: Waiting
556d656d31ee: Waiting
42a2bbb24dda: Waiting
d60096506969: Waiting
91f7336bbfff: Waiting
a81f1846a0d2: Waiting
3b441d7cb46b: Waiting
c5f64ca0958b: Waiting
e2e8c39e0f77: Waiting
5c81f9330d99: Waiting
26add80d0857: Waiting
927f9fcef4cf: Waiting
5a9675b73172: Pushed
dafad1a5d5c8: Pushed
374dcdae9a1c: Pushed
729213c8affb: Pushed
b94371377b84: Pushed
f90fd49241ed: Pushed
013a401b202f: Pushed
42a2bbb24dda: Pushed
5c81f9330d99: Layer already exists
927f9fcef4cf: Layer already exists
a81f1846a0d2: Layer already exists
3b441d7cb46b: Layer already exists
d3710de04cb3: Layer already exists
91f7336bbfff: Layer already exists
e2e8c39e0f77: Layer already exists
556d656d31ee: Pushed
c5f64ca0958b: Pushed
26add80d0857: Pushed
d60096506969: Pushed
20211218124335: digest: sha256:d6a535b4748b1064dfaed5c40963664267214ce85025f9395ceed633d522792a size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 18, 2021 12:45:59 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 18, 2021 12:46:00 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 18, 2021 12:46:01 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 18, 2021 12:46:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 18, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 18, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Dec 18, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 18, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 53b7448f8272c57275841216216b348c41d3ad246704ed53fca49c6fbeedc693> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-U7dEj4JyxXJ1hBIWIWs0jEHTrSRnBO1T_KScb77txpM.pb
Dec 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 18, 2021 12:46:06 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@cda6019, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@797c3c3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4012d5bc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4375b013, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cf0cacc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f5b08d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@529c2a9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c87fdf2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26bbe604, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe34b86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c98781a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f736a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4601203a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53abfc07, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c8c16c0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@80bfa9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47c40b56, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b039c6d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f5b9db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@507d64aa]
Dec 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 18, 2021 12:46:06 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34aeacd1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54067fdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4098dd77, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43aeb5e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2274160, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65383667, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63cd2cd2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@557a84fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6deee370, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49c17ba4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43e065f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@423c5404, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5853ca50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a0d96a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a02bfe3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a3e5cd3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c79088e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a37191a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5854a18, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d5556bf]
Dec 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 18, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-18_04_46_06-16435623552373968227?project=apache-beam-testing
Dec 18, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-18_04_46_06-16435623552373968227
Dec 18, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-18_04_46_06-16435623552373968227
Dec 18, 2021 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-18T12:46:13.230Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-8oa2. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:17.355Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.056Z: Expanding SplittableParDo operations into optimizable parts.
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.086Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.124Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.174Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.194Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.242Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.314Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.339Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.360Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.380Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.403Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.446Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.474Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.497Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.518Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.541Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.582Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.600Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.622Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.643Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.667Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.692Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.738Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.765Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.784Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.805Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.830Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.850Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:18.894Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:19.166Z: Starting 5 workers in us-central1-a...
Dec 18, 2021 12:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:46:43.157Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 18, 2021 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:47:04.744Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 18, 2021 12:48:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:48:00.068Z: Workers have started successfully.
Dec 18, 2021 12:48:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T12:48:00.116Z: Workers have started successfully.
Dec 18, 2021 1:52:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-18T13:52:20.370Z: Staged package error_prone_annotations-2.3.4-uvfW6pfOYGxT4RtoVLpfLOfvXCTd3wr6GNEmC9JbACw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/error_prone_annotations-2.3.4-uvfW6pfOYGxT4RtoVLpfLOfvXCTd3wr6GNEmC9JbACw.jar' is inaccessible.
Dec 18, 2021 1:52:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-18T13:52:22.825Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 18, 2021 1:55:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-18T13:55:22.343Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 18, 2021 1:58:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-18T13:58:20.435Z: Staged package error_prone_annotations-2.3.4-uvfW6pfOYGxT4RtoVLpfLOfvXCTd3wr6GNEmC9JbACw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/error_prone_annotations-2.3.4-uvfW6pfOYGxT4RtoVLpfLOfvXCTd3wr6GNEmC9JbACw.jar' is inaccessible.
Dec 18, 2021 1:58:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-18T13:58:22.797Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 18, 2021 2:01:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-18T14:01:24.722Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 18, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T16:00:30.419Z: Cancel request is committed for workflow job: 2021-12-18_04_46_06-16435623552373968227.
Dec 18, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T16:00:30.477Z: Cleaning up.
Dec 18, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T16:00:30.569Z: Stopping **** pool...
Dec 18, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T16:00:30.615Z: Stopping **** pool...
Dec 18, 2021 4:02:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T16:02:57.261Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 18, 2021 4:02:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-18T16:02:57.339Z: Worker pool stopped.
Dec 18, 2021 4:03:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-18_04_46_06-16435623552373968227 finished with status CANCELLED.
Load test results for test (ID): 031b0ca1-1ebc-4de9-a2d4-d7d3c4f5931b and timestamp: 2021-12-18T12:46:00.666000000Z:
Metric:                               Value:
dataflow_v2_java11_runtime_sec        11518.684
dataflow_v2_java11_total_bytes_count  3.08272235E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211218124335
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d6a535b4748b1064dfaed5c40963664267214ce85025f9395ceed633d522792a
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211218124335]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d6a535b4748b1064dfaed5c40963664267214ce85025f9395ceed633d522792a]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211218124335] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d6a535b4748b1064dfaed5c40963664267214ce85025f9395ceed633d522792a])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d6a535b4748b1064dfaed5c40963664267214ce85025f9395ceed633d522792a
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d6a535b4748b1064dfaed5c40963664267214ce85025f9395ceed633d522792a
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d6a535b4748b1064dfaed5c40963664267214ce85025f9395ceed633d522792a].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 45s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/g2qfngguyxniu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #183

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/183/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #16061 from [BEAM-13428] [Playground] Integrate

[noreply] Clarify CoGroupByKey creates Iterable, not list. (#16099)

[noreply] [BEAM-12931] Allow for DoFn#getAllowedTimestampSkew() when checking the

[noreply] [BEAM-13467] Properly handle null argument types for logical types.

[noreply] [BEAM-10277] Initial implementation for encoding position in Python

[noreply] [BEAM-11545] State & timer for batched RPC calls pattern (#13643)

[noreply] Automatically prune local images before building an RC. (#16238)

[noreply] Add verbose error messages to container-related scripts. (#16056)

[noreply] [BEAM-13456] Rollback #15890 to fix timeout in Java PostCommit (#16257)

[noreply] [BEAM-13015] Add a state backed iterable that can be mutated under

[noreply] [BEAM-13388] Update Cloud DLP after breaking changes. (#16236)

[noreply] [BEAM-13434] Bump google pubsublite on master. (#16265)

[mmack] [adhoc] Fix guava imports in tests


------------------------------------------
[...truncated 48.92 KB...]
d3710de04cb3: Preparing
91f7336bbfff: Preparing
e2e8c39e0f77: Preparing
a4ed61e3b241: Waiting
26aca54b886b: Waiting
6fd958ea90e6: Waiting
2eca0713fd62: Waiting
96d460f945b4: Waiting
5c81f9330d99: Waiting
de5518a41be7: Waiting
d3710de04cb3: Waiting
a4b5b297c9a6: Waiting
927f9fcef4cf: Waiting
91f7336bbfff: Waiting
3b441d7cb46b: Waiting
5f39dae7ca77: Pushed
f3c91d198746: Pushed
2b3e06b4bb3a: Pushed
5f2c7671d676: Pushed
d1b9b5402f54: Pushed
a4b5b297c9a6: Pushed
de5518a41be7: Pushed
2eca0713fd62: Pushed
a4ed61e3b241: Pushed
5c81f9330d99: Layer already exists
927f9fcef4cf: Layer already exists
a81f1846a0d2: Layer already exists
96d460f945b4: Pushed
d3710de04cb3: Layer already exists
3b441d7cb46b: Layer already exists
e2e8c39e0f77: Layer already exists
91f7336bbfff: Layer already exists
26aca54b886b: Pushed
6fd958ea90e6: Pushed
20211217124333: digest: sha256:3da14525b976f8e2f56ed6793ab3bd58b4a238bb2d965ae7e5f87ede13227d22 size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 17, 2021 12:45:23 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 17, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 17, 2021 12:45:24 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 17, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 17, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Dec 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash e72b00cfebc0709df0fda30f6949fce89db16b6fec385d9f81e73f700d4ad0e9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-5ysAz-vAcJ3w_aMPaUn86J2xa2_sOF2fgec_cA1K0Ok.pb
Dec 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 17, 2021 12:45:29 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@429f7919, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a2929a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@cda6019, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@797c3c3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4012d5bc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4375b013, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cf0cacc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f5b08d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@529c2a9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c87fdf2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26bbe604, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe34b86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c98781a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f736a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4601203a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53abfc07, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c8c16c0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@80bfa9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47c40b56, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b039c6d]
Dec 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 17, 2021 12:45:29 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67de7a99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@795f5d51, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34aeacd1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54067fdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4098dd77, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43aeb5e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2274160, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65383667, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63cd2cd2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@557a84fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6deee370, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49c17ba4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43e065f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@423c5404, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5853ca50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a0d96a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a02bfe3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a3e5cd3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c79088e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a37191a]
Dec 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 17, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-17_04_45_29-12572704560890537700?project=apache-beam-testing
Dec 17, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-17_04_45_29-12572704560890537700
Dec 17, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-17_04_45_29-12572704560890537700
Dec 17, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-17T12:45:36.615Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-lbzj. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 17, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:41.924Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 17, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:42.657Z: Expanding SplittableParDo operations into optimizable parts.
Dec 17, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:42.687Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 17, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:42.764Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 17, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:42.980Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 17, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.016Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 17, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.067Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.166Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.211Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.266Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.299Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.335Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.366Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.401Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.427Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.465Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.490Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.519Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.544Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.579Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.610Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.643Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.676Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.710Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.742Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.776Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.850Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.885Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.906Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:43.950Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 17, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:44.333Z: Starting 5 ****s in us-central1-a...
Dec 17, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:45:55.150Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 17, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:46:29.848Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 17, 2021 12:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:47:23.994Z: Workers have started successfully.
Dec 17, 2021 12:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T12:47:24.019Z: Workers have started successfully.
Dec 17, 2021 2:06:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-17T14:06:48.546Z: Staged package slf4j-api-1.7.30-zboHlk0btAoHYUhcax6ML4_Z6x0ZxTkorA1_lRAQXFc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/slf4j-api-1.7.30-zboHlk0btAoHYUhcax6ML4_Z6x0ZxTkorA1_lRAQXFc.jar' is inaccessible.
Dec 17, 2021 2:06:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-17T14:06:48.796Z: Staged package slf4j-jdk14-1.7.30-4PnbBJN49kZ5QXcUVJlSMyhft633LEZ-ZdryXmc6y6g.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/slf4j-jdk14-1.7.30-4PnbBJN49kZ5QXcUVJlSMyhft633LEZ-ZdryXmc6y6g.jar' is inaccessible.
Dec 17, 2021 2:06:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-17T14:06:48.981Z: Staged package xz-1.5-hvMPqHdfo6Ys2znR7XimAZFkwQWIZASNQsvuJE4m6EA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/xz-1.5-hvMPqHdfo6Ys2znR7XimAZFkwQWIZASNQsvuJE4m6EA.jar' is inaccessible.
Dec 17, 2021 2:06:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-17T14:06:49.043Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 17, 2021 2:09:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-17T14:09:48.484Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 17, 2021 2:12:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-17T14:12:48.335Z: Staged package slf4j-api-1.7.30-zboHlk0btAoHYUhcax6ML4_Z6x0ZxTkorA1_lRAQXFc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/slf4j-api-1.7.30-zboHlk0btAoHYUhcax6ML4_Z6x0ZxTkorA1_lRAQXFc.jar' is inaccessible.
Dec 17, 2021 2:12:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-17T14:12:48.378Z: Staged package slf4j-jdk14-1.7.30-4PnbBJN49kZ5QXcUVJlSMyhft633LEZ-ZdryXmc6y6g.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/slf4j-jdk14-1.7.30-4PnbBJN49kZ5QXcUVJlSMyhft633LEZ-ZdryXmc6y6g.jar' is inaccessible.
Dec 17, 2021 2:12:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-17T14:12:48.515Z: Staged package xz-1.5-hvMPqHdfo6Ys2znR7XimAZFkwQWIZASNQsvuJE4m6EA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/xz-1.5-hvMPqHdfo6Ys2znR7XimAZFkwQWIZASNQsvuJE4m6EA.jar' is inaccessible.
Dec 17, 2021 2:12:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-17T14:12:48.574Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 17, 2021 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T16:00:26.163Z: Cancel request is committed for workflow job: 2021-12-17_04_45_29-12572704560890537700.
Dec 17, 2021 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T16:00:26.223Z: Cleaning up.
Dec 17, 2021 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T16:00:26.304Z: Stopping **** pool...
Dec 17, 2021 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T16:00:26.387Z: Stopping **** pool...
Dec 17, 2021 4:03:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T16:03:00.156Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 17, 2021 4:03:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-17T16:03:00.224Z: Worker pool stopped.
Dec 17, 2021 4:03:11 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-17_04_45_29-12572704560890537700 finished with status CANCELLED.
Load test results for test (ID): 846c7e79-cc90-4b72-93cc-d692b6de4572 and timestamp: 2021-12-17T12:45:24.277000000Z:
Metric:                               Value:
dataflow_v2_java11_runtime_sec        11550.176
dataflow_v2_java11_total_bytes_count  2.08122015E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211217124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3da14525b976f8e2f56ed6793ab3bd58b4a238bb2d965ae7e5f87ede13227d22
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211217124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3da14525b976f8e2f56ed6793ab3bd58b4a238bb2d965ae7e5f87ede13227d22]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211217124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3da14525b976f8e2f56ed6793ab3bd58b4a238bb2d965ae7e5f87ede13227d22])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3da14525b976f8e2f56ed6793ab3bd58b4a238bb2d965ae7e5f87ede13227d22
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3da14525b976f8e2f56ed6793ab3bd58b4a238bb2d965ae7e5f87ede13227d22
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3da14525b976f8e2f56ed6793ab3bd58b4a238bb2d965ae7e5f87ede13227d22].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 55s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/txh6tnl4boxpk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #182

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/182/display/redirect?page=changes>

Changes:

[thiagotnunes] [BEAM-12164] Add Spanner Change Stream DAOs

[noreply] [BEAM-13341] Serialize reflection parameter in AvroCoder cloud object

[noreply] Update grafana from 8.1.6 to 8.1.8

[noreply] [BEAM-13015] Update FakeBeamFnStateClient to generate elements that stop

[noreply] [BEAM-13218] Sickbay

[noreply] [BEAM-13399] Add infrastructure to start JARs from Go functions (#16214)


------------------------------------------
[...truncated 49.25 KB...]
--verbosity=error -- pull gcr.io/project-id/my-image`.

See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
979821cbbfd3: Preparing
b145012a2908: Preparing
7ed69227c2c7: Preparing
7187d543cc5b: Preparing
d35cae3a437e: Preparing
9000f5ee7237: Preparing
b85415acf7ec: Preparing
a52437ae58f3: Preparing
6409b6c711e3: Preparing
b76032430ac0: Preparing
4bad0a5bdce6: Preparing
b85415acf7ec: Waiting
d51dd5d7ea8e: Preparing
5c81f9330d99: Preparing
a52437ae58f3: Waiting
6409b6c711e3: Waiting
927f9fcef4cf: Preparing
d51dd5d7ea8e: Waiting
5c81f9330d99: Waiting
a81f1846a0d2: Preparing
927f9fcef4cf: Waiting
3b441d7cb46b: Preparing
d3710de04cb3: Preparing
91f7336bbfff: Preparing
e2e8c39e0f77: Preparing
d3710de04cb3: Waiting
91f7336bbfff: Waiting
e2e8c39e0f77: Waiting
d35cae3a437e: Pushed
7ed69227c2c7: Pushed
b145012a2908: Pushed
7187d543cc5b: Pushed
979821cbbfd3: Pushed
9000f5ee7237: Pushed
a52437ae58f3: Pushed
6409b6c711e3: Pushed
5c81f9330d99: Layer already exists
927f9fcef4cf: Layer already exists
4bad0a5bdce6: Pushed
a81f1846a0d2: Layer already exists
3b441d7cb46b: Layer already exists
b85415acf7ec: Pushed
d3710de04cb3: Layer already exists
91f7336bbfff: Layer already exists
e2e8c39e0f77: Layer already exists
d51dd5d7ea8e: Pushed
b76032430ac0: Pushed
20211216124331: digest: sha256:928bf6cb464d1d69a15b79707de65bca46c61955e9a231cc433259d92fd92918 size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 16, 2021 12:45:29 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 16, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 16, 2021 12:45:30 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 16, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 16, 2021 12:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 16, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Dec 16, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 16, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 19cd5482a8334519433fcb84863911c94794d7405b09df933ec542a57a44bfc0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Gc1UgqgzRRlDP8uEhjkRyUeU10BbCd-TPsVCpXpEv8A.pb
Dec 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 16, 2021 12:45:35 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@429f7919, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a2929a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@cda6019, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@797c3c3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4012d5bc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4375b013, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cf0cacc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f5b08d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@529c2a9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c87fdf2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26bbe604, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe34b86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c98781a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f736a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4601203a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53abfc07, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c8c16c0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@80bfa9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47c40b56, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b039c6d]
Dec 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 16, 2021 12:45:35 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67de7a99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@795f5d51, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34aeacd1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54067fdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4098dd77, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43aeb5e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2274160, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65383667, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63cd2cd2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@557a84fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6deee370, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49c17ba4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43e065f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@423c5404, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5853ca50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a0d96a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a02bfe3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a3e5cd3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c79088e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a37191a]
Dec 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 16, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-16_04_45_35-10483991347121633974?project=apache-beam-testing
Dec 16, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-16_04_45_35-10483991347121633974
Dec 16, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-16_04_45_35-10483991347121633974
Dec 16, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-16T12:45:42.646Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-7zc. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 16, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:47.213Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 16, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:47.946Z: Expanding SplittableParDo operations into optimizable parts.
Dec 16, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:47.979Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 16, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.035Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 16, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.101Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.133Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.201Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.318Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.371Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.436Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.469Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.503Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.538Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.564Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.587Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.623Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.647Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.678Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.711Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.744Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.769Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.810Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.835Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.861Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.888Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.915Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.948Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:48.983Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:49.016Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:49.056Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:45:49.517Z: Starting 5 workers in us-central1-a...
Dec 16, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:46:02.319Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 16, 2021 12:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:46:38.599Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 16, 2021 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:47:35.246Z: Workers have started successfully.
Dec 16, 2021 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T12:47:35.284Z: Workers have started successfully.
Dec 16, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T16:00:24.576Z: Cancel request is committed for workflow job: 2021-12-16_04_45_35-10483991347121633974.
Dec 16, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T16:00:24.633Z: Cleaning up.
Dec 16, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T16:00:24.703Z: Stopping worker pool...
Dec 16, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T16:00:24.757Z: Stopping worker pool...
Dec 16, 2021 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T16:02:50.835Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 16, 2021 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-16T16:02:50.869Z: Worker pool stopped.
Dec 16, 2021 4:02:57 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-16_04_45_35-10483991347121633974 finished with status CANCELLED.
Load test results for test (ID): ba554a53-b20c-47fc-8030-b29d681a884c and timestamp: 2021-12-16T12:45:29.794000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11544.556
dataflow_v2_java11_total_bytes_count             2.40614275E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

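For quick triage across runs, the two metrics reported above (runtime in seconds and total bytes processed) can be folded into an average throughput figure. A small sketch of that arithmetic (the hard-coded values are copied from this run's metric table; the class and method names are ours):

```java
// Average throughput derived from the load-test metrics reported above:
// dataflow_v2_java11_runtime_sec and dataflow_v2_java11_total_bytes_count.
public class Throughput {
    static double mbPerSec(double totalBytes, double runtimeSec) {
        // 1e6 bytes per MB, matching the decimal convention of the metric.
        return totalBytes / runtimeSec / 1e6;
    }

    public static void main(String[] args) {
        double runtimeSec = 11544.556;      // from this run
        double totalBytes = 2.40614275e10;  // from this run
        System.out.printf("avg throughput: %.2f MB/s%n",
                mbPerSec(totalBytes, runtimeSec));
    }
}
```

Applied to the two runs with metric tables in this thread, this gives roughly 1.8 MB/s and 2.1 MB/s respectively, so the cancelled runs were processing at a broadly similar rate before the test window expired.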
> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211216124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:928bf6cb464d1d69a15b79707de65bca46c61955e9a231cc433259d92fd92918
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211216124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:928bf6cb464d1d69a15b79707de65bca46c61955e9a231cc433259d92fd92918]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211216124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:928bf6cb464d1d69a15b79707de65bca46c61955e9a231cc433259d92fd92918])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:928bf6cb464d1d69a15b79707de65bca46c61955e9a231cc433259d92fd92918
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:928bf6cb464d1d69a15b79707de65bca46c61955e9a231cc433259d92fd92918
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:928bf6cb464d1d69a15b79707de65bca46c61955e9a231cc433259d92fd92918].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 42s
101 actionable tasks: 72 executed, 27 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/bsx7pty42qhqo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #181

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/181/display/redirect?page=changes>

Changes:

[aydar.zaynutdinov] [BEAM-13438][Playground]

[stranniknm] [BEAM-13446]: fix incorrect tab symbol

[noreply] Removes the comment that seems no longer relevant.

[Valentyn Tymofieiev] Update a few dependencies that may depend on log4j transitively.

[noreply] [BEAM-13159] Add Redis Stream (XADD) Write Support (#15858)

[noreply] Merge pull request #15994: [BEAM-13263] Support OnWindowExpiration in

[noreply] Bump dataflow container version to beam-master-20211213 (#16213)

[noreply] Merge pull request #16198 from [BEAM-13437][Playground] Add

[noreply] Merge pull request #16195 from a[BEAM-13436][Playground] Add

[noreply] [BEAM-13355] add Big Query parameter to enable users to specify load_…

[noreply] Updated the base images to use debian:bullseye (#16221)

[noreply] [BEAM-13434] Bump log4j to 2.16.0. (#16237)

[mmack] [BEAM-13209] Fix DynamoDBIO.write to properly handle partial success

[mmack] [BEAM-13209] Fix DynamoDBIO.write to properly handle partial success


------------------------------------------
[...truncated 48.51 KB...]
a6f21ca4c13a: Preparing
355bcc10b031: Preparing
6715dd53b547: Preparing
37871a55cfa4: Preparing
2c67ebf7a7fe: Preparing
cb734847c41f: Preparing
34516ab260ab: Preparing
240094277970: Preparing
066dc1a81373: Preparing
1aac0408ad5e: Preparing
1a4e4daffa72: Preparing
a10e6b6dd906: Preparing
5c81f9330d99: Preparing
927f9fcef4cf: Preparing
a81f1846a0d2: Preparing
3b441d7cb46b: Preparing
d3710de04cb3: Preparing
91f7336bbfff: Preparing
e2e8c39e0f77: Preparing
240094277970: Waiting
1a4e4daffa72: Waiting
066dc1a81373: Waiting
a10e6b6dd906: Waiting
1aac0408ad5e: Waiting
5c81f9330d99: Waiting
927f9fcef4cf: Waiting
cb734847c41f: Waiting
a81f1846a0d2: Waiting
3b441d7cb46b: Waiting
d3710de04cb3: Waiting
34516ab260ab: Waiting
91f7336bbfff: Waiting
e2e8c39e0f77: Waiting
2c67ebf7a7fe: Pushed
355bcc10b031: Pushed
6715dd53b547: Pushed
cb734847c41f: Pushed
a6f21ca4c13a: Pushed
37871a55cfa4: Pushed
240094277970: Pushed
066dc1a81373: Pushed
5c81f9330d99: Layer already exists
927f9fcef4cf: Layer already exists
1a4e4daffa72: Pushed
34516ab260ab: Pushed
a81f1846a0d2: Layer already exists
d3710de04cb3: Layer already exists
3b441d7cb46b: Layer already exists
91f7336bbfff: Layer already exists
e2e8c39e0f77: Layer already exists
a10e6b6dd906: Pushed
1aac0408ad5e: Pushed
20211215124334: digest: sha256:805d323837d7649ae3ceecb2a5c90d3f21fa2c7bd3ede8482f2fea45c32e032e size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 15, 2021 12:45:38 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 15, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 15, 2021 12:45:39 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 15, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 15, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 15, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Dec 15, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 15, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash d68f5f8d5e255a25c6e559ed626763190554af23f0e4c8a868c946a0d7e536a5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-1o9fjV4lWiXG5VntYmdjGQVUryPw5MioaMlGoNflNqU.pb
Dec 15, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 15, 2021 12:45:44 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@429f7919, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a2929a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@cda6019, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@797c3c3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4012d5bc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4375b013, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cf0cacc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f5b08d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@529c2a9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c87fdf2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26bbe604, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe34b86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c98781a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f736a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4601203a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53abfc07, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c8c16c0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@80bfa9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47c40b56, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b039c6d]
Dec 15, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 15, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 15, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 15, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 15, 2021 12:45:44 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67de7a99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@795f5d51, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34aeacd1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54067fdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4098dd77, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43aeb5e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2274160, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65383667, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63cd2cd2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@557a84fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6deee370, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49c17ba4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43e065f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@423c5404, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5853ca50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a0d96a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a02bfe3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a3e5cd3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c79088e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a37191a]
Dec 15, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 15, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 15, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 15, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 15, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 15, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 15, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 15, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 15, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 15, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 15, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 15, 2021 12:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 15, 2021 12:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-15_04_45_45-17558208930821720948?project=apache-beam-testing
Dec 15, 2021 12:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-15_04_45_45-17558208930821720948
Dec 15, 2021 12:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-15_04_45_45-17558208930821720948
Dec 15, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-15T12:45:52.557Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-qwxw. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 15, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:57.454Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.081Z: Expanding SplittableParDo operations into optimizable parts.
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.113Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.170Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.241Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.288Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.352Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.485Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.524Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.557Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.591Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.624Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.651Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.675Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.734Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.769Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.796Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.827Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.861Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.892Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.925Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.959Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:58.994Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:59.021Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:59.044Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:59.077Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:59.120Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:59.140Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:59.169Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:59.197Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 15, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:45:59.644Z: Starting 5 workers in us-central1-a...
Dec 15, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:46:06.577Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 15, 2021 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:46:48.870Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 15, 2021 12:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:47:46.059Z: Workers have started successfully.
Dec 15, 2021 12:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T12:47:46.094Z: Workers have started successfully.
Dec 15, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T16:00:33.178Z: Cancel request is committed for workflow job: 2021-12-15_04_45_45-17558208930821720948.
Dec 15, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T16:00:33.290Z: Cleaning up.
Dec 15, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T16:00:33.398Z: Stopping worker pool...
Dec 15, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T16:00:33.472Z: Stopping worker pool...
Dec 15, 2021 4:03:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T16:02:59.817Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 15, 2021 4:03:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-15T16:02:59.856Z: Worker pool stopped.
Dec 15, 2021 4:03:06 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-15_04_45_45-17558208930821720948 finished with status CANCELLED.
Load test results for test (ID): 3718cf26-2e64-4308-a399-4e20a0c9fa25 and timestamp: 2021-12-15T12:45:39.535000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11428.455
dataflow_v2_java11_total_bytes_count             1.90166544E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211215124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:805d323837d7649ae3ceecb2a5c90d3f21fa2c7bd3ede8482f2fea45c32e032e
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211215124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:805d323837d7649ae3ceecb2a5c90d3f21fa2c7bd3ede8482f2fea45c32e032e]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211215124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:805d323837d7649ae3ceecb2a5c90d3f21fa2c7bd3ede8482f2fea45c32e032e])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:805d323837d7649ae3ceecb2a5c90d3f21fa2c7bd3ede8482f2fea45c32e032e
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:805d323837d7649ae3ceecb2a5c90d3f21fa2c7bd3ede8482f2fea45c32e032e
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:805d323837d7649ae3ceecb2a5c90d3f21fa2c7bd3ede8482f2fea45c32e032e].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 51s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cjb5s4ekmw7vs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #180

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/180/display/redirect?page=changes>

Changes:

[Daniel Oliveira] [BEAM-13321] Pass TempLocation as pipeline option to Dataflow Go for

[Robert Bradshaw] Better type hints for Count combiners.

[Kyle Weaver] Include name of missing tag in error message.

[stranniknm] [BEAM-13423]: fix frontend failure if no examples

[daria.malkova] change return type of 2 methods

[mmack] [BEAM-13441] Use quiet delete for S3 batch deletes. In quiet mode only

[noreply] Updating Grafana from v8.1.2 to v8.1.6

[daria.malkova] Docs for validators tests

[daria.malkova] change context type

[noreply] Merge pull request #16140 from [BEAM-13377][Playground] Update CI/CD

[noreply] Merge pull request #16120 from [BEAM-13333][Playground] Save Python logs

[noreply] Merge pull request #16185 from [BEAM-13425][Playground][Bugfix] Support

[mmack] [BEAM-13445] Correctly set data limit when flushing S3 upload buffer and

[noreply] Merge pull request #16121 from [BEAM-13334][Playground] Save Go logs to

[noreply] Merge pull request #16179 from [BEAM-13344][Playground] support python

[noreply] Merge pull request #16208 from [BEAM-13442][Playground] Filepath to log

[noreply] [BEAM-13276] bump jackson-core to 2.13.0 for .test-infra (#16062)

[noreply] Change Pub/Sub Lite PollResult to set explicit watermark (#16216)

[noreply] [BEAM-13454] Fix and test dataframe read_fwf. (#16064)

[noreply] [BEAM-12976] Pipeline visitor to discover pushdown opportunities.

[noreply] [BEAM-13015] Allow decoding a set of elements until we hit the block


------------------------------------------
[...truncated 48.58 KB...]
91f7336bbfff: Waiting
78c3a7b74ad8: Waiting
1a7bf77856fc: Waiting
5626069a74e0: Waiting
e2e8c39e0f77: Waiting
db60aa4405a5: Pushed
3ac9c2fecc70: Pushed
cb89b8924d43: Pushed
752b3aca70b1: Pushed
afa057d8b3b8: Pushed
19bab224b506: Pushed
28c351f2219b: Pushed
78c3a7b74ad8: Pushed
5c81f9330d99: Layer already exists
927f9fcef4cf: Layer already exists
a81f1846a0d2: Layer already exists
3b441d7cb46b: Layer already exists
d3710de04cb3: Layer already exists
aeef33a16417: Pushed
0028bf7c6381: Pushed
91f7336bbfff: Layer already exists
e2e8c39e0f77: Layer already exists
1a7bf77856fc: Pushed
5626069a74e0: Pushed
20211214124339: digest: sha256:45bb2db85b92cc45e097bd29bc34b54a987268c83de0688d5487a1df2fb46615 size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 14, 2021 12:45:58 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 14, 2021 12:45:59 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 14, 2021 12:45:59 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 14, 2021 12:45:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 14, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 14, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Dec 14, 2021 12:46:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 14, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash cfa809eda01b4a61e8be5729d895f703dd9635230eb18f9a8af2c0a8d40a307b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-z6gJ7aAbSmHovlcp2JX3A92WNSMOsY-aivLAqNQKMHs.pb
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 14, 2021 12:46:04 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@429f7919, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a2929a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@cda6019, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@797c3c3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4012d5bc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4375b013, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cf0cacc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f5b08d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@529c2a9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c87fdf2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26bbe604, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe34b86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c98781a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f736a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4601203a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53abfc07, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c8c16c0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@80bfa9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47c40b56, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b039c6d]
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 14, 2021 12:46:04 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65e0b505, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67de7a99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@795f5d51, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34aeacd1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54067fdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4098dd77, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43aeb5e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2274160, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65383667, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63cd2cd2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@557a84fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6deee370, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49c17ba4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43e065f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@423c5404, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5853ca50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a0d96a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a02bfe3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a3e5cd3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c79088e]
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 14, 2021 12:46:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-14_04_46_04-10847864043267183166?project=apache-beam-testing
Dec 14, 2021 12:46:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-14_04_46_04-10847864043267183166
Dec 14, 2021 12:46:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-14_04_46_04-10847864043267183166
Dec 14, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-14T12:46:13.628Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-rstx. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 14, 2021 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:23.055Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 14, 2021 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.009Z: Expanding SplittableParDo operations into optimizable parts.
Dec 14, 2021 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.028Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.101Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.178Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.210Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.285Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.396Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.459Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.498Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.537Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.598Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.622Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.670Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.716Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.737Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.765Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.815Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.845Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.886Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.916Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.976Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:25.014Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:25.063Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:25.095Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:25.127Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:25.158Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:25.190Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:25.221Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:25.266Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:25.716Z: Starting 5 workers in us-central1-a...
Dec 14, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:55.690Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 14, 2021 12:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:47:17.195Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 14, 2021 12:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:48:12.483Z: Workers have started successfully.
Dec 14, 2021 12:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:48:12.530Z: Workers have started successfully.
Dec 14, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T16:00:32.573Z: Cancel request is committed for workflow job: 2021-12-14_04_46_04-10847864043267183166.
Dec 14, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T16:00:32.675Z: Cleaning up.
Dec 14, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T16:00:32.805Z: Stopping worker pool...
Dec 14, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T16:00:32.878Z: Stopping worker pool...
Dec 14, 2021 4:02:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T16:02:58.021Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 14, 2021 4:02:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T16:02:58.066Z: Worker pool stopped.
Dec 14, 2021 4:03:04 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-14_04_46_04-10847864043267183166 finished with status CANCELLED.
Load test results for test (ID): 13c2cfc1-d692-4f89-91f5-e3d08ce48dfa and timestamp: 2021-12-14T12:45:59.353000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11458.167
dataflow_v2_java11_total_bytes_count               9.8452935E9
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
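The stack trace above shows the load test converting the job's CANCELLED terminal state into a RuntimeException, which is what fails the Gradle task. A minimal sketch of that pattern follows; the class and enum names are illustrative, not the actual Beam implementation:

```java
// Sketch of terminal-state checking as suggested by the stack trace.
// Any terminal state other than DONE is treated as a test failure, so the
// surrounding process exits non-zero and Jenkins marks the build as failed.
public class JobStateCheck {
    enum State { DONE, CANCELLED, FAILED }

    static void handleFailure(State terminalState) {
        if (terminalState != State.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

This matches the observed behavior here: the scheduled 4-hour streaming job is cancelled by the test harness, and the cancellation itself is then reported as a failure.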

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211214124339
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45bb2db85b92cc45e097bd29bc34b54a987268c83de0688d5487a1df2fb46615
Deleted: sha256:31984e3cd5f1effd1bd83c233ba0c78e20abcf7cd4e861d837ab4d3d7a601cca
Deleted: sha256:7fd662f7010957ea9bf06bd7b335722f5425f5945f1b6cc6feab97ac52e8e7fd
Deleted: sha256:4d4d41f751686de528528116d227644035570c1cd97b37e2a8fc057f3255a238
Deleted: sha256:b1cd886d55c654df34127fe34baea43da56864e78bd8334e683ae77875d579ee
Deleted: sha256:6cac39d37c9cd6d832a471a67b21f82cea908b29ed529d18a28e98fb71187c38
Deleted: sha256:abc187c15a666a531d7b847e877975e16c07725ef8767bfe06da04752e67a56e
Deleted: sha256:32ff2e0b2822bc7a8a2071756d89f88ade4ce577aed6bf7a7ce31d90ba210748
Deleted: sha256:27bdec2438eaa53e37f2b4a553f9707b0d3bdecfae90f21a68cac6c523940144
Deleted: sha256:7e8e36f085f7cf3e8956b3bfc793a2b02a56c2ac02dc59f085176da8671725a3
Deleted: sha256:93a07ba63b9743f9a6cada9153e29b0e758df3bcff1fb2a90841c118da9edd44
Deleted: sha256:f50a1ee842578cc7b5f3d913bee9a4fbb4d86328d385d08089531af1d3d4f5b8
Deleted: sha256:e199d0e1d081f216364c30f580245c8cdb9db6e689151bc793cb2cda03e9f397
Deleted: sha256:42892fa12be941b704b86cb93cc9e2139322b7c45f4089b886ca89ea055ee6a6
Deleted: sha256:ff7bf5914e1b701f46657e3bcf9b6f6a6fdaff2c75b540bc12ff3f054321747a
Deleted: sha256:46d1b1c2bff7d6ade58020d09acacc3bc18d3b02622d0ea58722c8c996ecf650
Deleted: sha256:f814d152d91795c598d29fb102db03bbf5151b48a024179e2aae3f59650c44a8
Deleted: sha256:6661f2432f0a2d022d406c9a506f7cbc19fa30a84d0a692fffe816142af49cb8
Deleted: sha256:e261346411209ddcad0986f86b18732b00d752be3e07f8712c5cc9a83e9d68c0
Deleted: sha256:c3b179df6819bb1c89c64d2323066dda947d731827c0fc910b1771fd3babf718
Deleted: sha256:97bce8c173e989faa9c6b4a92db55d37e3c7300324a2c3787328833058e8310e
Deleted: sha256:50f45e4b8146d82054d62f41c00de62a55d34918ecbc435fe6771daacb327c8d
Deleted: sha256:b72af5263a1234b9da845a5ed2eec5385fb5af6bafda0ef97ed62716b6a1daf8
Deleted: sha256:6fff26c6d7afe5201f873571c0985c804e842f6d38c75668705dc18afba2060a
Deleted: sha256:30bfe25a08623fd67fa95e4486c93c14b2f2e42a6c52f2dfde8f711953017b17
Deleted: sha256:e8f9be52dfdcf23f1ad37027bf2ecbcbf928be8e2ac3a11b34e1f37e26911cdb
Deleted: sha256:8712d68b22b4e09124a8acea3fe418cd751713e1c17343ea7d26fa783c834a28
Deleted: sha256:5cb8d21e7d1bb69f836eed28a20d53e4aa42076bdd1484ee9fc65e4e1ff5cc22
Deleted: sha256:033cdcc5ad9788237d1333d911b1bbda210747136cbf0e29a4500410f955db70
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211214124339]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45bb2db85b92cc45e097bd29bc34b54a987268c83de0688d5487a1df2fb46615]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211214124339] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45bb2db85b92cc45e097bd29bc34b54a987268c83de0688d5487a1df2fb46615])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45bb2db85b92cc45e097bd29bc34b54a987268c83de0688d5487a1df2fb46615
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45bb2db85b92cc45e097bd29bc34b54a987268c83de0688d5487a1df2fb46615
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45bb2db85b92cc45e097bd29bc34b54a987268c83de0688d5487a1df2fb46615].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 45s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dj6jyepho2omm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #179

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/179/display/redirect>

Changes:


------------------------------------------
[...truncated 47.91 KB...]
c3b63d9fc1de: Preparing
ae06d66027ce: Preparing
a18cd92eaadd: Preparing
b7e6ea413031: Preparing
12dc5cb5c701: Preparing
c5af9847714e: Preparing
11ddc425808c: Preparing
2dc9c8e01d86: Preparing
7c5469ff15b9: Preparing
cfea4461c462: Preparing
fc3cb2ae0622: Preparing
2c94c0d5ec2a: Preparing
5c81f9330d99: Preparing
927f9fcef4cf: Preparing
a81f1846a0d2: Preparing
3b441d7cb46b: Preparing
d3710de04cb3: Preparing
91f7336bbfff: Preparing
e2e8c39e0f77: Preparing
5c81f9330d99: Waiting
c5af9847714e: Waiting
11ddc425808c: Waiting
2dc9c8e01d86: Waiting
7c5469ff15b9: Waiting
927f9fcef4cf: Waiting
a81f1846a0d2: Waiting
3b441d7cb46b: Waiting
d3710de04cb3: Waiting
cfea4461c462: Waiting
91f7336bbfff: Waiting
e2e8c39e0f77: Waiting
fc3cb2ae0622: Waiting
2c94c0d5ec2a: Waiting
a18cd92eaadd: Pushed
ae06d66027ce: Pushed
12dc5cb5c701: Pushed
c3b63d9fc1de: Pushed
b7e6ea413031: Pushed
c5af9847714e: Pushed
2dc9c8e01d86: Pushed
7c5469ff15b9: Pushed
5c81f9330d99: Layer already exists
927f9fcef4cf: Layer already exists
a81f1846a0d2: Layer already exists
fc3cb2ae0622: Pushed
3b441d7cb46b: Layer already exists
d3710de04cb3: Layer already exists
91f7336bbfff: Layer already exists
e2e8c39e0f77: Layer already exists
11ddc425808c: Pushed
2c94c0d5ec2a: Pushed
cfea4461c462: Pushed
20211213124334: digest: sha256:a96d644e82f013762c58cbeed13ed6efaf9972ee34c6e1cd7455717bf792f725 size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 13, 2021 12:45:31 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 13, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 13, 2021 12:45:32 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 13, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 13, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 13, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Dec 13, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 13, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 427c36924631917117938c224b9e85327c30559c04a58a5931bf5fcf6e0bfd8e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Qnw2kkYxkXEXk4wiS56FMnwwVZwEpYpZMb9fz24L_Y4.pb
Dec 13, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 13, 2021 12:45:38 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@429f7919, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a2929a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@cda6019, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@797c3c3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4012d5bc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4375b013, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cf0cacc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f5b08d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@529c2a9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c87fdf2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26bbe604, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe34b86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c98781a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f736a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4601203a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53abfc07, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c8c16c0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@80bfa9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47c40b56, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b039c6d]
Dec 13, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 13, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 13, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 13, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 13, 2021 12:45:38 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67de7a99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@795f5d51, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34aeacd1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54067fdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4098dd77, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43aeb5e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2274160, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65383667, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63cd2cd2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@557a84fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6deee370, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49c17ba4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43e065f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@423c5404, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5853ca50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a0d96a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a02bfe3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a3e5cd3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c79088e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a37191a]
Dec 13, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 13, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 13, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 13, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 13, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 13, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 13, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 13, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 13, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 13, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 13, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 13, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 13, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-13_04_45_38-8427878871215758655?project=apache-beam-testing
Dec 13, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-13_04_45_38-8427878871215758655
Dec 13, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-13_04_45_38-8427878871215758655
Dec 13, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-13T12:45:44.978Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-51om. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 13, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:50.828Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:51.392Z: Expanding SplittableParDo operations into optimizable parts.
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:51.416Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:51.481Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:51.561Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:51.590Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:51.646Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:51.726Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:51.751Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:51.774Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:51.815Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:51.874Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:51.904Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:51.926Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:51.947Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:51.979Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:52.002Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:52.025Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:52.066Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:52.097Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:52.128Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:52.159Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:52.188Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:52.210Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:52.235Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:52.258Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:52.299Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:52.333Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:52.367Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 13, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:52.390Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 13, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:45:52.836Z: Starting 5 workers in us-central1-a...
Dec 13, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:46:23.770Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 13, 2021 12:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:46:45.413Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 13, 2021 12:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:47:42.506Z: Workers have started successfully.
Dec 13, 2021 12:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T12:47:42.544Z: Workers have started successfully.
Dec 13, 2021 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T16:00:41.242Z: Cancel request is committed for workflow job: 2021-12-13_04_45_38-8427878871215758655.
Dec 13, 2021 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T16:00:41.343Z: Cleaning up.
Dec 13, 2021 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T16:00:41.422Z: Stopping worker pool...
Dec 13, 2021 4:00:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T16:00:41.477Z: Stopping worker pool...
Dec 13, 2021 4:03:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T16:03:02.467Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 13, 2021 4:03:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-13T16:03:02.512Z: Worker pool stopped.
Dec 13, 2021 4:03:09 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-13_04_45_38-8427878871215758655 finished with status CANCELLED.
Load test results for test (ID): e91ae5fc-48e2-4750-8649-239f3cf28650 and timestamp: 2021-12-13T12:45:32.427000000Z:
Metric:                                  Value:
dataflow_v2_java11_runtime_sec           11553.599
dataflow_v2_java11_total_bytes_count     1.37818616E10
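
For readability, the two metric values above convert to human units as follows (values copied from the table; this is plain unit arithmetic, not part of the harness):

```java
// Convert the reported load-test metrics into human-readable units.
public class MetricUnits {
    public static void main(String[] args) {
        double runtimeSec = 11553.599;       // dataflow_v2_java11_runtime_sec
        double totalBytes = 1.37818616e10;   // dataflow_v2_java11_total_bytes_count

        long hours = (long) (runtimeSec / 3600);    // whole hours
        double minutes = (runtimeSec % 3600) / 60;  // remaining minutes
        double gib = totalBytes / (1L << 30);       // bytes -> GiB

        System.out.printf("runtime: %dh %.1fm%n", hours, minutes);  // runtime: 3h 12.6m
        System.out.printf("processed: %.2f GiB%n", gib);            // processed: 12.84 GiB
    }
}
```

That is, roughly 3 hours 13 minutes of runtime and about 12.8 GiB processed before the job was cancelled.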
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
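
The RuntimeException above is the load-test harness's expected behavior: JobFailure.handleFailure fails the run when the job ends in any terminal state other than DONE, and this job was cancelled (by the job timeout) rather than completing. A minimal sketch of that check, with the class shape and signature assumed here (only the thrown message matches the log):

```java
// Hedged sketch of the terminal-state check behind "Invalid job state: CANCELLED."
// The real org.apache.beam.sdk.loadtests.JobFailure internals are not shown in
// this log; this class and enum are illustrative only.
public class JobStateCheck {
    public enum State { DONE, FAILED, CANCELLED, UPDATED, UNKNOWN }

    public static void handleFailure(State terminalState) {
        // Any terminal state other than DONE is treated as a load-test failure.
        if (terminalState != State.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // prints: Invalid job state: CANCELLED.
        }
    }
}
```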

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211213124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a96d644e82f013762c58cbeed13ed6efaf9972ee34c6e1cd7455717bf792f725
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211213124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a96d644e82f013762c58cbeed13ed6efaf9972ee34c6e1cd7455717bf792f725]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211213124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a96d644e82f013762c58cbeed13ed6efaf9972ee34c6e1cd7455717bf792f725])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a96d644e82f013762c58cbeed13ed6efaf9972ee34c6e1cd7455717bf792f725
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a96d644e82f013762c58cbeed13ed6efaf9972ee34c6e1cd7455717bf792f725
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a96d644e82f013762c58cbeed13ed6efaf9972ee34c6e1cd7455717bf792f725].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 53s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wny2bv5m3y2js

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #178

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/178/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13015] Plumb the cache through contexts and transform executors.


------------------------------------------
[...truncated 48.47 KB...]
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
bdb7d24d918e: Preparing
11b84a6aa952: Preparing
d4c1a9bb5a7d: Preparing
a655e3d39ede: Preparing
b2e27d664be5: Preparing
facb02d1b60d: Preparing
78adbc62f53b: Preparing
7ed4397a3936: Preparing
bb796e33bed4: Preparing
704cd2e7c191: Preparing
1585f76ccba2: Preparing
c6b20a252f15: Preparing
5c81f9330d99: Preparing
927f9fcef4cf: Preparing
a81f1846a0d2: Preparing
3b441d7cb46b: Preparing
d3710de04cb3: Preparing
91f7336bbfff: Preparing
e2e8c39e0f77: Preparing
bb796e33bed4: Waiting
704cd2e7c191: Waiting
5c81f9330d99: Waiting
d3710de04cb3: Waiting
1585f76ccba2: Waiting
c6b20a252f15: Waiting
91f7336bbfff: Waiting
e2e8c39e0f77: Waiting
927f9fcef4cf: Waiting
a81f1846a0d2: Waiting
3b441d7cb46b: Waiting
7ed4397a3936: Waiting
78adbc62f53b: Waiting
11b84a6aa952: Pushed
d4c1a9bb5a7d: Pushed
b2e27d664be5: Pushed
bdb7d24d918e: Pushed
facb02d1b60d: Pushed
a655e3d39ede: Pushed
7ed4397a3936: Pushed
bb796e33bed4: Pushed
1585f76ccba2: Pushed
5c81f9330d99: Layer already exists
927f9fcef4cf: Layer already exists
a81f1846a0d2: Layer already exists
3b441d7cb46b: Layer already exists
d3710de04cb3: Layer already exists
e2e8c39e0f77: Layer already exists
91f7336bbfff: Layer already exists
78adbc62f53b: Pushed
c6b20a252f15: Pushed
704cd2e7c191: Pushed
20211212124331: digest: sha256:998f5f145ac83bda9ffc267aaf52ecee8dcf982df2e5f1512f38911771983144 size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 12, 2021 12:45:24 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 12, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 12, 2021 12:45:25 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 12, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 12, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 12, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Dec 12, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 12, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 5d9254e89b25ac2c37390ef094ab6858354b518890d68901dc0ac6d816123d44> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-XZJU6JslrCw3OQ7wlKtoWDVLUYiQ1okB3ArG2BYSPUQ.pb
Dec 12, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 12, 2021 12:45:30 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@429f7919, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a2929a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@cda6019, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@797c3c3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4012d5bc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4375b013, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cf0cacc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f5b08d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@529c2a9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c87fdf2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26bbe604, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe34b86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c98781a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f736a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4601203a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53abfc07, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c8c16c0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@80bfa9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47c40b56, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b039c6d]
Dec 12, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 12, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 12, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 12, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 12, 2021 12:45:30 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67de7a99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@795f5d51, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34aeacd1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54067fdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4098dd77, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43aeb5e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2274160, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65383667, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63cd2cd2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@557a84fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6deee370, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49c17ba4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43e065f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@423c5404, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5853ca50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a0d96a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a02bfe3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a3e5cd3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c79088e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a37191a]
Dec 12, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 12, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 12, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 12, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 12, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 12, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 12, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 12, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 12, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 12, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 12, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 12, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 12, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-12_04_45_30-13716156934319324213?project=apache-beam-testing
Dec 12, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-12_04_45_30-13716156934319324213
Dec 12, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-12_04_45_30-13716156934319324213
Dec 12, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-12T12:45:39.271Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-maqb. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 12, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:42.606Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 12, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:43.302Z: Expanding SplittableParDo operations into optimizable parts.
Dec 12, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:43.335Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 12, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:43.406Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:43.518Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:43.561Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:43.622Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:43.749Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:43.778Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:43.808Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:43.834Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:43.870Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:43.911Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:43.957Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:43.994Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:44.024Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:44.092Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:44.118Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:44.151Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:44.183Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:44.203Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:44.234Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:44.265Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:44.288Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:44.309Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:44.336Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:44.395Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:44.446Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:44.470Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 12, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:44.522Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 12, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:45:44.945Z: Starting 5 workers in us-central1-a...
Dec 12, 2021 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:46:01.111Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 12, 2021 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:46:32.025Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 12, 2021 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:47:33.770Z: Workers have started successfully.
Dec 12, 2021 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T12:47:33.807Z: Workers have started successfully.
Dec 12, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T16:00:33.061Z: Cancel request is committed for workflow job: 2021-12-12_04_45_30-13716156934319324213.
Dec 12, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T16:00:33.145Z: Cleaning up.
Dec 12, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T16:00:33.231Z: Stopping worker pool...
Dec 12, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T16:00:33.293Z: Stopping worker pool...
Dec 12, 2021 4:03:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T16:03:00.132Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 12, 2021 4:03:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-12T16:03:00.188Z: Worker pool stopped.
Dec 12, 2021 4:03:06 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-12_04_45_30-13716156934319324213 finished with status CANCELLED.
Load test results for test (ID): c46ba322-b8ed-40fd-8c55-b02c55d5d85e and timestamp: 2021-12-12T12:45:25.604000000Z:
Metric:                                  Value:
dataflow_v2_java11_runtime_sec           11552.879
dataflow_v2_java11_total_bytes_count     1.24155733E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211212124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:998f5f145ac83bda9ffc267aaf52ecee8dcf982df2e5f1512f38911771983144
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211212124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:998f5f145ac83bda9ffc267aaf52ecee8dcf982df2e5f1512f38911771983144]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211212124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:998f5f145ac83bda9ffc267aaf52ecee8dcf982df2e5f1512f38911771983144])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:998f5f145ac83bda9ffc267aaf52ecee8dcf982df2e5f1512f38911771983144
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:998f5f145ac83bda9ffc267aaf52ecee8dcf982df2e5f1512f38911771983144
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:998f5f145ac83bda9ffc267aaf52ecee8dcf982df2e5f1512f38911771983144].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 51s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qu2jacavsjiu2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #177

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/177/display/redirect?page=changes>

Changes:

[aydar.zaynutdinov] [BEAM-13424][Playground]

[aydar.zaynutdinov] [BEAM-13424][Playground]

[stranniknm] [BEAM-13426] clear log and output panels on example change

[aydar.zaynutdinov] [BEAM-13428][Playground]

[aydar.zaynutdinov] [BEAM-13424][Playground]

[daria.malkova] add type of precompiled object

[Alexey Romanenko] [BEAM-13171] Fix Duration ambiguity

[mmack] [BEAM-13410][BEAM-11788] Integration test for S3Filesystem using

[Alexey Romanenko] [BEAM-13434] Bump up Apache log4j2 vulnerability to 2.15.0

[noreply] Merge pull request #16138 from [BEAM-13343][Playground] Support go unit

[noreply] [BEAM-12560] Dataframe idxmin and idxmax implementation (#15827)

[noreply] [BEAM-12565] Dataframe compare implementation (#16027)

[noreply] [BEAM-12976] Use a map to pass all pushdown requests at once. (#16189)

[noreply] [BEAM-12683]  Fix failing integration tests for Python Recommendation AI


------------------------------------------
[...truncated 49.27 KB...]
153afbc36442: Waiting
b5b597fd8241: Waiting
46804051de77: Waiting
0f0e7b544a14: Waiting
5c81f9330d99: Waiting
a81f1846a0d2: Waiting
d3710de04cb3: Waiting
5eb36e995c47: Waiting
3b441d7cb46b: Waiting
91f7336bbfff: Waiting
927f9fcef4cf: Waiting
e9a1ac956616: Pushed
189ba2214737: Pushed
dcb926f09b43: Pushed
4df5ad5e1754: Pushed
0f0e7b544a14: Pushed
6f85da26d475: Pushed
db1e8994175d: Pushed
79642cf2a3d7: Pushed
5c81f9330d99: Layer already exists
927f9fcef4cf: Layer already exists
a81f1846a0d2: Layer already exists
153afbc36442: Pushed
3b441d7cb46b: Layer already exists
5eb36e995c47: Pushed
46804051de77: Pushed
d3710de04cb3: Layer already exists
91f7336bbfff: Layer already exists
e2e8c39e0f77: Layer already exists
b5b597fd8241: Pushed
20211211124333: digest: sha256:9ac788139e4d04ce1bc9fb2b5f402a69fd9f83c427e8008bc77be5e9c5d5cfba size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 11, 2021 12:45:27 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 11, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 11, 2021 12:45:28 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 11, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Dec 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash a24d499f43aa5afb43675ad7cfbe5e8c97a1d32636640a9e741dc88edd8e4ebc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ok1Jn0OqWvtDZ1rXz75ejJeh0yY2ZAqedB3Ijt2OTrw.pb
Dec 11, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 11, 2021 12:45:33 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@429f7919, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a2929a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@cda6019, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@797c3c3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4012d5bc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4375b013, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cf0cacc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f5b08d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@529c2a9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c87fdf2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26bbe604, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe34b86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c98781a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f736a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4601203a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53abfc07, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c8c16c0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@80bfa9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47c40b56, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b039c6d]
Dec 11, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 11, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 11, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 11, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 11, 2021 12:45:33 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67de7a99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@795f5d51, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34aeacd1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54067fdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4098dd77, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43aeb5e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2274160, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65383667, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63cd2cd2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@557a84fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6deee370, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49c17ba4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43e065f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@423c5404, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5853ca50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a0d96a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a02bfe3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a3e5cd3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c79088e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a37191a]
Dec 11, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 11, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 11, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 11, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 11, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 11, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 11, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 11, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 11, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 11, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 11, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 11, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 11, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-11_04_45_33-8265912439203147639?project=apache-beam-testing
Dec 11, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-11_04_45_33-8265912439203147639
Dec 11, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-11_04_45_33-8265912439203147639
Dec 11, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-11T12:45:40.772Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-nrkf. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:45.560Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:46.279Z: Expanding SplittableParDo operations into optimizable parts.
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:46.321Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:46.414Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:46.619Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:46.712Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:46.794Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:46.938Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:46.991Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.054Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.104Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.144Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.198Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.241Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.282Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.327Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.368Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.401Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.437Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.471Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.504Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.536Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.570Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.611Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.646Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.677Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.714Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.737Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.780Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:47.831Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:48.285Z: Starting 5 workers in us-central1-a...
Dec 11, 2021 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:45:59.401Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 11, 2021 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:46:35.464Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 11, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:47:32.918Z: Workers have started successfully.
Dec 11, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T12:47:32.952Z: Workers have started successfully.
Dec 11, 2021 2:24:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-11T14:24:48.458Z: Staged package amazon-kinesis-client-1.14.2-qtBh3OuKuLE7S26ITTJzYsoYfG-mWjqbGmv3VJPhec8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/amazon-kinesis-client-1.14.2-qtBh3OuKuLE7S26ITTJzYsoYfG-mWjqbGmv3VJPhec8.jar' is inaccessible.
Dec 11, 2021 2:24:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-11T14:24:48.823Z: Staged package aws-java-sdk-cloudwatch-1.12.106-kUP1xZSNspLnoAnC0aAzibk9w0IXNbBNeMBESBYiZtg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-cloudwatch-1.12.106-kUP1xZSNspLnoAnC0aAzibk9w0IXNbBNeMBESBYiZtg.jar' is inaccessible.
Dec 11, 2021 2:24:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-11T14:24:48.894Z: Staged package aws-java-sdk-core-1.12.106-2a5g2j72YwhD9uNhLJWqRljsgds1wEVsdH_XLgZZOlQ.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-core-1.12.106-2a5g2j72YwhD9uNhLJWqRljsgds1wEVsdH_XLgZZOlQ.jar' is inaccessible.
Dec 11, 2021 2:24:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-11T14:24:48.998Z: Staged package aws-java-sdk-dynamodb-1.12.106-s8z13aUxEWhNzx7J8mY6rVcDrK2iLAcBqT3Pb9sNgmM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-dynamodb-1.12.106-s8z13aUxEWhNzx7J8mY6rVcDrK2iLAcBqT3Pb9sNgmM.jar' is inaccessible.
Dec 11, 2021 2:24:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-11T14:24:49.096Z: Staged package aws-java-sdk-kinesis-1.12.106-wMPghgT8YvcVokv233DC2r8Q0eeaqh7phw3Anmz6S4o.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-kinesis-1.12.106-wMPghgT8YvcVokv233DC2r8Q0eeaqh7phw3Anmz6S4o.jar' is inaccessible.
Dec 11, 2021 2:24:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-11T14:24:49.186Z: Staged package aws-java-sdk-kms-1.12.106-JkHOiVwKCYdQ3d-wR7YYej-O0UmkZOioL923IYI9PwM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-kms-1.12.106-JkHOiVwKCYdQ3d-wR7YYej-O0UmkZOioL923IYI9PwM.jar' is inaccessible.
Dec 11, 2021 2:24:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-11T14:24:49.274Z: Staged package aws-java-sdk-s3-1.12.106-5YV0cnyV551lh5cwCZGFhypcyG7NsqTcEP0IBjaAnjE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-s3-1.12.106-5YV0cnyV551lh5cwCZGFhypcyG7NsqTcEP0IBjaAnjE.jar' is inaccessible.
Dec 11, 2021 2:24:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-11T14:24:52.095Z: Staged package jakarta.activation-api-1.2.2-oYepORA671hJp6-EvX4nvi0SDEEK8pFDc3X_4GH08J0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jakarta.activation-api-1.2.2-oYepORA671hJp6-EvX4nvi0SDEEK8pFDc3X_4GH08J0.jar' is inaccessible.
Dec 11, 2021 2:24:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-11T14:24:52.184Z: Staged package jakarta.xml.bind-api-2.3.3-wEU59HLppt0MdoXqgtZ3KCJpq457rKLhRQDjgeDGzsU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jakarta.xml.bind-api-2.3.3-wEU59HLppt0MdoXqgtZ3KCJpq457rKLhRQDjgeDGzsU.jar' is inaccessible.
Dec 11, 2021 2:24:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-11T14:24:52.291Z: Staged package jmespath-java-1.12.106-_mWkV0KPXB56ULuR7j6dxTg6hrDhcycHHtCXYBv9V7s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jmespath-java-1.12.106-_mWkV0KPXB56ULuR7j6dxTg6hrDhcycHHtCXYBv9V7s.jar' is inaccessible.
Dec 11, 2021 2:24:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-11T14:24:53.513Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
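When triaging failures like the SEVERE "Staged package ... is inaccessible" messages above, it helps to collect every affected package and GCS path from the log in one pass. The snippet below is a small, hypothetical helper (not part of Beam or Dataflow) that pattern-matches those log lines:

```python
import re

# Matches Dataflow staging errors of the form seen above:
#   SEVERE: ... Staged package <name>.jar at location 'gs://...' is inaccessible.
PATTERN = re.compile(
    r"Staged package (\S+) at location '(gs://[^']+)' is inaccessible"
)

def inaccessible_packages(log_lines):
    """Return (package, gcs_path) pairs for every staging failure in the log."""
    return [m.groups() for line in log_lines if (m := PATTERN.search(line))]

sample = [
    "SEVERE: 2021-12-11T14:24:48.458Z: Staged package amazon-kinesis-client-1.14.2-x.jar "
    "at location 'gs://temp-storage-for-perf-tests/loadtests/staging/"
    "amazon-kinesis-client-1.14.2-x.jar' is inaccessible.",
    "INFO: unrelated line",
]
print(inaccessible_packages(sample))
```

Feeding the console log through this would list each jar whose access check failed, which can then be checked against the staging bucket's ACLs as suggested by the security-and-permissions link in the warning above.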
Dec 11, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T16:00:36.512Z: Cancel request is committed for workflow job: 2021-12-11_04_45_33-8265912439203147639.
Dec 11, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T16:00:36.560Z: Cleaning up.
Dec 11, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T16:00:36.692Z: Stopping worker pool...
Dec 11, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T16:00:36.737Z: Stopping worker pool...
Dec 11, 2021 4:03:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T16:03:08.331Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 11, 2021 4:03:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-11T16:03:08.390Z: Worker pool stopped.
Dec 11, 2021 4:03:13 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-11_04_45_33-8265912439203147639 finished with status CANCELLED.
Load test results for test (ID): a7ec4f31-1219-4c92-83ee-701fcf7fb5ca and timestamp: 2021-12-11T12:45:28.549000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11539.658
dataflow_v2_java11_total_bytes_count             2.05228429E10
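As a rough sanity check, the two metrics above imply an average throughput for the cancelled run. A minimal sketch, with the values copied from the result table (approximate, since the runtime includes startup and idle time):

```python
# Values reported in the load-test result table above.
runtime_sec = 11539.658      # dataflow_v2_java11_runtime_sec
total_bytes = 2.05228429e10  # dataflow_v2_java11_total_bytes_count

# Average throughput in decimal megabytes per second.
throughput_mb_per_sec = total_bytes / runtime_sec / 1e6
print(f"~{throughput_mb_per_sec:.2f} MB/s average")
```

This works out to roughly 1.78 MB/s averaged over the whole run.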
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211211124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9ac788139e4d04ce1bc9fb2b5f402a69fd9f83c427e8008bc77be5e9c5d5cfba
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211211124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9ac788139e4d04ce1bc9fb2b5f402a69fd9f83c427e8008bc77be5e9c5d5cfba]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211211124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9ac788139e4d04ce1bc9fb2b5f402a69fd9f83c427e8008bc77be5e9c5d5cfba])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9ac788139e4d04ce1bc9fb2b5f402a69fd9f83c427e8008bc77be5e9c5d5cfba
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9ac788139e4d04ce1bc9fb2b5f402a69fd9f83c427e8008bc77be5e9c5d5cfba
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9ac788139e4d04ce1bc9fb2b5f402a69fd9f83c427e8008bc77be5e9c5d5cfba].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 59s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/kbe3wuhsiqndw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #176

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/176/display/redirect?page=changes>

Changes:

[slebedev] [BEAM-13384] Re-exported the metrics subpackage in

[zyichi] [BEAM-13401] Wait for worker start to reduce flakiness in

[zyichi] Fix pytest unknown markers warning.

[noreply] [BEAM-11936] Remove suppression in ReadFromKafkaDoFn (#16174)

[noreply] [BEAM-12561] method truncate on series and dataframe (#15833)

[noreply] [BEAM-13294] Widen key schema for all keys before use (#16158)

[noreply] [BEAM-13171] Support for stopReadTime on KafkaIO SDF (#15951)

[noreply] [BEAM-13399] Add functionality to download Beam JARs from Maven 

[noreply] [BEAM-13402] Add another workaround for


------------------------------------------
[...truncated 49.96 KB...]
bb4cbb83d9b0: Pushed
bf3006ee3329: Pushed
a610e68468f8: Pushed
5c81f9330d99: Layer already exists
927f9fcef4cf: Layer already exists
a81f1846a0d2: Layer already exists
a5e4846680ab: Pushed
3b441d7cb46b: Layer already exists
d3710de04cb3: Layer already exists
91f7336bbfff: Layer already exists
e2e8c39e0f77: Layer already exists
b110851a7e00: Pushed
0a3213d92595: Pushed
9ba9cab944bf: Pushed
20211210124332: digest: sha256:ab60f3f67a394688a065bc59fab36c1a9e596035691253dfe305d091bb97fcc9 size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 10, 2021 12:45:27 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 10, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 10, 2021 12:45:28 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 10, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 10, 2021 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 10, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Dec 10, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 10, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 9833b2477d8cb7464bc8248aa2e368e67d7aabdd2aa80e1ca78491e3babb1760> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-mDOyR32Mt0ZLyCSKouNo5n16q90qqA4cp4SR47q7F2A.pb
Dec 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 10, 2021 12:45:33 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@429f7919, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a2929a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@cda6019, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@797c3c3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4012d5bc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4375b013, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cf0cacc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f5b08d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@529c2a9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c87fdf2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26bbe604, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe34b86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c98781a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f736a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4601203a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53abfc07, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c8c16c0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@80bfa9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47c40b56, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b039c6d]
Dec 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 10, 2021 12:45:33 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67de7a99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@795f5d51, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34aeacd1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54067fdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4098dd77, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43aeb5e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2274160, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65383667, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63cd2cd2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@557a84fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6deee370, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49c17ba4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43e065f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@423c5404, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5853ca50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a0d96a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a02bfe3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a3e5cd3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c79088e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a37191a]
Dec 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 10, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-10_04_45_33-8039644437749157402?project=apache-beam-testing
Dec 10, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-10_04_45_33-8039644437749157402
Dec 10, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-10_04_45_33-8039644437749157402
Dec 10, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-10T12:45:44.540Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-dp3d. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 10, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:50.739Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 10, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:51.747Z: Expanding SplittableParDo operations into optimizable parts.
Dec 10, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:51.873Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 10, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:52.358Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 10, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:52.757Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 10, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:52.869Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 10, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:53.246Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 10, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:53.879Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 10, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:54.076Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 10, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:54.243Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 10, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:54.341Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 10, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:54.476Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 10, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:54.667Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 10, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:54.881Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 10, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:55.217Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 10, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:55.362Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 10, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:55.951Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 10, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:56.160Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 10, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:56.267Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 10, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:56.553Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 10, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:56.654Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 10, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:56.777Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 10, 2021 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:57.006Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 10, 2021 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:57.239Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 10, 2021 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:57.344Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 10, 2021 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:57.449Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 10, 2021 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:57.558Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 10, 2021 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:57.674Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 10, 2021 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:57.863Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 10, 2021 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:57.965Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 10, 2021 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:45:59.103Z: Starting 5 workers in us-central1-a...
Dec 10, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:46:08.924Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 10, 2021 12:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:46:39.650Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 10, 2021 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:47:40.140Z: Workers have started successfully.
Dec 10, 2021 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T12:47:40.187Z: Workers have started successfully.
Dec 10, 2021 1:16:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-10T13:16:00.107Z: Staged package annotations-4.1.1.4-unNOHoTAnWFa9qCdMwNLTwRC-Hct7BIO-zdthqVlrhU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/annotations-4.1.1.4-unNOHoTAnWFa9qCdMwNLTwRC-Hct7BIO-zdthqVlrhU.jar' is inaccessible.
Dec 10, 2021 1:16:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-10T13:16:00.437Z: Staged package avro-1.8.2-91SggwzmelqfpnpU7BXRA-8V4chQ17Jvr3tkfu3cgtM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/avro-1.8.2-91SggwzmelqfpnpU7BXRA-8V4chQ17Jvr3tkfu3cgtM.jar' is inaccessible.
Dec 10, 2021 1:16:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-10T13:16:01.090Z: Staged package beam-vendor-bytebuddy-1_11_0-0.1-SneHjzQgCpxxLkvyN8dugqO_sTzwXJ2995kqFu4l2i4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-bytebuddy-1_11_0-0.1-SneHjzQgCpxxLkvyN8dugqO_sTzwXJ2995kqFu4l2i4.jar' is inaccessible.
Dec 10, 2021 1:16:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-10T13:16:01.239Z: Staged package beam-vendor-guava-26_0-jre-0.1-8quizfXTrmcvJyOXi3zWDFtP5EofgqjksY5sIbsBqws.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-guava-26_0-jre-0.1-8quizfXTrmcvJyOXi3zWDFtP5EofgqjksY5sIbsBqws.jar' is inaccessible.
Dec 10, 2021 1:16:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-10T13:16:01.386Z: Staged package checker-compat-qual-2.5.5-EdE0skXpysxHRRTS1mtbhhj4A5oUZc3FW7wLNOAAi3o.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/checker-compat-qual-2.5.5-EdE0skXpysxHRRTS1mtbhhj4A5oUZc3FW7wLNOAAi3o.jar' is inaccessible.
Dec 10, 2021 1:16:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-10T13:16:01.702Z: Staged package commons-lang3-3.9-3i4dzc8--ReozoWGYaBnJqmpRPKOM61_ngi-pE3DwjA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-lang3-3.9-3i4dzc8--ReozoWGYaBnJqmpRPKOM61_ngi-pE3DwjA.jar' is inaccessible.
Dec 10, 2021 1:16:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-10T13:16:01.846Z: Staged package commons-logging-1.2-2t3qHqC-D1aXirMAa4rJKDSv7vvZt-TmMW_KV98PpjY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-logging-1.2-2t3qHqC-D1aXirMAa4rJKDSv7vvZt-TmMW_KV98PpjY.jar' is inaccessible.
Dec 10, 2021 1:16:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-10T13:16:01.936Z: Staged package commons-math3-3.6.1-HlbXsFjSi2Wr0la4RY44hbZ0wdWI-kPNfRy7nH7yswg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-math3-3.6.1-HlbXsFjSi2Wr0la4RY44hbZ0wdWI-kPNfRy7nH7yswg.jar' is inaccessible.
Dec 10, 2021 1:16:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-10T13:16:03.464Z: Staged package hamcrest-2.1-upOy46ViMiukMvChtTrdzFXLGIJTMZoCDtd_gk5pIFA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/hamcrest-2.1-upOy46ViMiukMvChtTrdzFXLGIJTMZoCDtd_gk5pIFA.jar' is inaccessible.
Dec 10, 2021 1:16:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-10T13:16:03.583Z: Staged package hamcrest-core-2.1-4JEJ5UoonYhQa5v-yYfd0Zn0IXyUZBMmaDUbmk8Avuk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/hamcrest-core-2.1-4JEJ5UoonYhQa5v-yYfd0Zn0IXyUZBMmaDUbmk8Avuk.jar' is inaccessible.
Dec 10, 2021 1:16:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-10T13:16:03.746Z: Staged package j2objc-annotations-1.3-Ia8wySJnvWEiwOC00gzMtmQaN-r5VsZUDsRx1YTmSns.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/j2objc-annotations-1.3-Ia8wySJnvWEiwOC00gzMtmQaN-r5VsZUDsRx1YTmSns.jar' is inaccessible.
Dec 10, 2021 1:16:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-10T13:16:03.886Z: Staged package jackson-core-asl-1.9.13-RAqctcqVshX5U9OiCmsaENofCbUpqd3qX4pJBd2rT1o.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-core-asl-1.9.13-RAqctcqVshX5U9OiCmsaENofCbUpqd3qX4pJBd2rT1o.jar' is inaccessible.
Dec 10, 2021 1:16:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-10T13:16:03.995Z: Staged package jackson-mapper-asl-1.9.13-dOegenby7breKTEqWi68z6AZEovAIezjhW12GX6b4MI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-mapper-asl-1.9.13-dOegenby7breKTEqWi68z6AZEovAIezjhW12GX6b4MI.jar' is inaccessible.
Dec 10, 2021 1:16:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-10T13:16:04.087Z: Staged package javax.annotation-api-1.3.2-4EulGVvNVV3JVlD3zGFNFR5LzVLSmhC4qiGX86uJq5s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/javax.annotation-api-1.3.2-4EulGVvNVV3JVlD3zGFNFR5LzVLSmhC4qiGX86uJq5s.jar' is inaccessible.
Dec 10, 2021 1:16:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-10T13:16:04.187Z: Staged package json-20200518-D_MObagvFa-vDna9bplqerVWaFEWaoP5c8ZKmg-UbS0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/json-20200518-D_MObagvFa-vDna9bplqerVWaFEWaoP5c8ZKmg-UbS0.jar' is inaccessible.
Dec 10, 2021 1:16:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-10T13:16:04.260Z: Staged package jsr305-3.0.2-dmrSoHg_JoeWLIrXTO7MOKKLn3Ki0IXuQ4t4E-ko0Mc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jsr305-3.0.2-dmrSoHg_JoeWLIrXTO7MOKKLn3Ki0IXuQ4t4E-ko0Mc.jar' is inaccessible.
Dec 10, 2021 1:16:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-10T13:16:05.967Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
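The block of SEVERE staging errors above all follows the same pattern. As a rough aid for triaging logs like this one (a hypothetical helper, not part of the Beam test harness), the failing GCS paths can be extracted and then checked individually, e.g. with `gsutil stat`:

```python
import re

# Matches the Dataflow staging errors seen above, e.g.
# "Staged package foo.jar at location 'gs://bucket/path/foo.jar' is inaccessible."
STAGED_PKG_RE = re.compile(
    r"Staged package (\S+) at location '(gs://[^']+)' is inaccessible")

def inaccessible_packages(log_lines):
    """Return (jar_name, gcs_path) pairs for every staging failure found."""
    hits = []
    for line in log_lines:
        m = STAGED_PKG_RE.search(line)
        if m:
            hits.append((m.group(1), m.group(2)))
    return hits

sample = [
    "SEVERE: 2021-12-10T13:16:04.260Z: Staged package jsr305-3.0.2-dmrSoHg_JoeWLIrXTO7MOKKLn3Ki0IXuQ4t4E-ko0Mc.jar "
    "at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jsr305-3.0.2-dmrSoHg_JoeWLIrXTO7MOKKLn3Ki0IXuQ4t4E-ko0Mc.jar' is inaccessible.",
]
print(inaccessible_packages(sample))
```

Each extracted `gs://` path can then be probed for existence and readability with the service account the job runs as.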
Dec 10, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T16:00:32.457Z: Cancel request is committed for workflow job: 2021-12-10_04_45_33-8039644437749157402.
Dec 10, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T16:00:32.533Z: Cleaning up.
Dec 10, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T16:00:32.709Z: Stopping worker pool...
Dec 10, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T16:00:32.781Z: Stopping worker pool...
Dec 10, 2021 4:03:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T16:03:01.029Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 10, 2021 4:03:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-10T16:03:01.216Z: Worker pool stopped.
Dec 10, 2021 4:03:09 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-10_04_45_33-8039644437749157402 finished with status CANCELLED.
Load test results for test (ID): ce69be53-432f-4e46-a534-1b78b7868de6 and timestamp: 2021-12-10T12:45:28.320000000Z:
Metric:                                       Value:
dataflow_v2_java11_runtime_sec             11534.626
dataflow_v2_java11_total_bytes_count   2.12050114E10
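The two metrics above can be combined into an average throughput figure before the job was cancelled. A quick sketch using the values printed above (plain arithmetic, not an official Beam utility):

```python
# Values taken from the load test result lines above.
runtime_sec = 11534.626        # dataflow_v2_java11_runtime_sec
total_bytes = 2.12050114e10    # dataflow_v2_java11_total_bytes_count

# Average bytes processed per second, expressed in MB/s (1 MB = 1e6 bytes).
throughput_mb_per_sec = total_bytes / runtime_sec / 1e6
print(f"avg throughput: {throughput_mb_per_sec:.2f} MB/s")
```

Roughly 21 GB over ~3.2 hours works out to on the order of 1.8 MB/s, which is the sustained rate the streaming CoGBK pipeline held before cancellation.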
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211210124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ab60f3f67a394688a065bc59fab36c1a9e596035691253dfe305d091bb97fcc9
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211210124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ab60f3f67a394688a065bc59fab36c1a9e596035691253dfe305d091bb97fcc9]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211210124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ab60f3f67a394688a065bc59fab36c1a9e596035691253dfe305d091bb97fcc9])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ab60f3f67a394688a065bc59fab36c1a9e596035691253dfe305d091bb97fcc9
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ab60f3f67a394688a065bc59fab36c1a9e596035691253dfe305d091bb97fcc9
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ab60f3f67a394688a065bc59fab36c1a9e596035691253dfe305d091bb97fcc9].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c1d0672425677151b308da8514055cc06bccff2f62e23ecec8908f3ef953eac4
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c1d0672425677151b308da8514055cc06bccff2f62e23ecec8908f3ef953eac4
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c1d0672425677151b308da8514055cc06bccff2f62e23ecec8908f3ef953eac4].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 54s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/52do64cfepmyo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #175

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/175/display/redirect?page=changes>

Changes:

[stranniknm] fix playground frontend licences

[noreply] Merge pull request #16167 from [BEAM-13409][Playground] [Bugfix] Change

[noreply] Merge pull request #16136 from [BEAM-13365] [Playground] Add Pipelines

[noreply] [BEAM-13244] Support STS Assume role credentials provider for AWS SDK v2

[noreply] [BEAM-11936] Fix errorprone UnusedVariable in core,examples,harness..

[noreply] [BEAM-11936] Remove suppressUnusedVariable flag (#16171)

[noreply] [BEAM-13090] Adding SDK harness container overrides option to Java SDK

[noreply] [BEAM-11936] Fix errorprone warnings (#15890)

[noreply] [BEAM-13015] Start integrating a process wide cache. (#16130)


------------------------------------------
[...truncated 61.38 KB...]
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:52.921Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:52.946Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:52.977Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:53.001Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:53.047Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:53.076Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:53.102Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:53.121Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:53.150Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:53.178Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:53.199Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:53.225Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:53.247Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:53.273Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:53.301Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:53.325Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:53.348Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:53.375Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 09, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:45:53.815Z: Starting 5 workers in us-central1-a...
Dec 09, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:46:11.514Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 09, 2021 12:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:46:45.472Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 09, 2021 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:47:41.493Z: Workers have started successfully.
Dec 09, 2021 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T12:47:41.522Z: Workers have started successfully.
Dec 09, 2021 1:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-09T13:45:55.069Z: Staged package commons-compress-1.21-auz9VFlyillWAc-gcljRMZcv_Dm0kutIvdWWV3ovJEo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-compress-1.21-auz9VFlyillWAc-gcljRMZcv_Dm0kutIvdWWV3ovJEo.jar' is inaccessible.
Dec 09, 2021 1:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-09T13:45:55.277Z: Staged package conscrypt-openjdk-uber-2.5.1-AfnHQstZKhUeLmK9U5eomAYoqWcAH82s1KpHRGeGhfM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/conscrypt-openjdk-uber-2.5.1-AfnHQstZKhUeLmK9U5eomAYoqWcAH82s1KpHRGeGhfM.jar' is inaccessible.
Dec 09, 2021 1:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-09T13:45:57.602Z: Staged package netty-buffer-4.1.52.Final-QcoQNa7m4PgXdZfQp2EKWqLG5vrHRKyB_eTl5Y9jlnU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-buffer-4.1.52.Final-QcoQNa7m4PgXdZfQp2EKWqLG5vrHRKyB_eTl5Y9jlnU.jar' is inaccessible.
Dec 09, 2021 1:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-09T13:45:57.658Z: Staged package netty-codec-4.1.52.Final-PH543M5-NT2GrH26LCUatc3dTrhuqoic_kIpeCuCjlI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-4.1.52.Final-PH543M5-NT2GrH26LCUatc3dTrhuqoic_kIpeCuCjlI.jar' is inaccessible.
Dec 09, 2021 1:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-09T13:45:57.715Z: Staged package netty-codec-http-4.1.52.Final-tdoo0OqHiwfTERrPXk-vQSiO8jArJkFBkP3Rp9lzyAw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-http-4.1.52.Final-tdoo0OqHiwfTERrPXk-vQSiO8jArJkFBkP3Rp9lzyAw.jar' is inaccessible.
Dec 09, 2021 1:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-09T13:45:57.767Z: Staged package netty-codec-http2-4.1.52.Final-61J6QtIonRLe5tR1PKJxWXije7tBxxkVzk6VOcSfk6E.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-http2-4.1.52.Final-61J6QtIonRLe5tR1PKJxWXije7tBxxkVzk6VOcSfk6E.jar' is inaccessible.
Dec 09, 2021 1:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-09T13:45:57.811Z: Staged package netty-codec-socks-4.1.52.Final-_sxXfgu3ch1naM_BhG7hcpEdFquA372ZldllelUdTqc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-socks-4.1.52.Final-_sxXfgu3ch1naM_BhG7hcpEdFquA372ZldllelUdTqc.jar' is inaccessible.
Dec 09, 2021 1:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-09T13:45:57.856Z: Staged package netty-common-4.1.52.Final-DR8QHk231TDtBNfKvVfZXAdQ8Fm7az-t_tDIDUc0xxc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-common-4.1.52.Final-DR8QHk231TDtBNfKvVfZXAdQ8Fm7az-t_tDIDUc0xxc.jar' is inaccessible.
Dec 09, 2021 1:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-09T13:45:57.897Z: Staged package netty-handler-4.1.52.Final-LNwh-_rgSUDIWSkDl3nJP1H2GPXvshYHkxnc_PMqVcE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-handler-4.1.52.Final-LNwh-_rgSUDIWSkDl3nJP1H2GPXvshYHkxnc_PMqVcE.jar' is inaccessible.
Dec 09, 2021 1:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-09T13:45:57.947Z: Staged package netty-handler-proxy-4.1.52.Final-6NB4Y860vUgG4lWh1TLIZQ8POy1NAGrW0hgS6UG0L9w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-handler-proxy-4.1.52.Final-6NB4Y860vUgG4lWh1TLIZQ8POy1NAGrW0hgS6UG0L9w.jar' is inaccessible.
Dec 09, 2021 1:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-09T13:45:57.998Z: Staged package netty-resolver-4.1.52.Final-butyojOXnLRbC01tRsWkEzLFfAz4qAOx78YNu-y6CaY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-resolver-4.1.52.Final-butyojOXnLRbC01tRsWkEzLFfAz4qAOx78YNu-y6CaY.jar' is inaccessible.
Dec 09, 2021 1:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-09T13:45:58.063Z: Staged package netty-transport-4.1.52.Final-mj5vjA5V3jY-seoQ_ngXl-yjlOYhht8q4LTrK84LRUE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-transport-4.1.52.Final-mj5vjA5V3jY-seoQ_ngXl-yjlOYhht8q4LTrK84LRUE.jar' is inaccessible.
Dec 09, 2021 1:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-09T13:45:58.110Z: Staged package opencensus-api-0.28.0-DBcj8_bTBhMjhFzouIs1_dpQCBLgp1uOtfzErYyHGpU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/opencensus-api-0.28.0-DBcj8_bTBhMjhFzouIs1_dpQCBLgp1uOtfzErYyHGpU.jar' is inaccessible.
Dec 09, 2021 1:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-09T13:45:58.233Z: Staged package paranamer-2.8-aIyxGKYCHYGROOhVIIyVYDFoi-S0eiS7YVvsxjrO3wc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/paranamer-2.8-aIyxGKYCHYGROOhVIIyVYDFoi-S0eiS7YVvsxjrO3wc.jar' is inaccessible.
Dec 09, 2021 1:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-09T13:45:58.365Z: Staged package perfmark-api-0.23.0-xwW1wQwY_zAyuegXQrwvaw5WB_am38DIrQz_ddSRMEI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/perfmark-api-0.23.0-xwW1wQwY_zAyuegXQrwvaw5WB_am38DIrQz_ddSRMEI.jar' is inaccessible.
Dec 09, 2021 1:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-09T13:45:58.832Z: Staged package snappy-java-1.1.8.4-JMTR_B6J4HgzGrj0AamcrWhZm95KLkUWBCy1SMUbHD4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/snappy-java-1.1.8.4-JMTR_B6J4HgzGrj0AamcrWhZm95KLkUWBCy1SMUbHD4.jar' is inaccessible.
Dec 09, 2021 1:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-09T13:45:58.957Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
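Each SEVERE line above names one staged jar that the Dataflow service could not read from the staging bucket. As a triage aid, here is a minimal sketch that extracts the distinct inaccessible package names from log lines of the exact shape shown above; the class and method names are illustrative, not part of Beam:

```java
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.Set;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class InaccessiblePackages {
    // Matches lines like:
    // SEVERE: <timestamp>: Staged package <name>.jar at location 'gs://...' is inaccessible.
    private static final Pattern STAGED = Pattern.compile(
        "Staged package (\\S+\\.jar) at location '(gs://\\S+)' is inaccessible");

    /** Collects each failing jar once, even when the error repeats across retries. */
    public static Set<String> distinctPackages(Iterable<String> logLines) {
        Set<String> packages = new LinkedHashSet<>();
        for (String line : logLines) {
            Matcher m = STAGED.matcher(line);
            if (m.find()) {
                packages.add(m.group(1));
            }
        }
        return packages;
    }

    public static void main(String[] args) {
        Set<String> pkgs = distinctPackages(Arrays.asList(
            "SEVERE: 2021-12-09T13:45:55.069Z: Staged package commons-compress-1.21-x.jar"
                + " at location 'gs://bucket/staging/commons-compress-1.21-x.jar' is inaccessible.",
            "SEVERE: 2021-12-09T13:51:55.096Z: Staged package commons-compress-1.21-x.jar"
                + " at location 'gs://bucket/staging/commons-compress-1.21-x.jar' is inaccessible."));
        System.out.println(pkgs); // the duplicate retry collapses to a single entry
    }
}
```

Run against the block above, this reduces the four retry cycles to the same 17 unique jars, which points at a bucket- or service-account-level permission problem rather than individual bad uploads (see the security-and-permissions link in the warning).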
[...the same staged-package "is inaccessible" errors and access-check warnings repeated at 13:48, 13:51, 13:54, 13:57, 14:00, and 14:03 UTC — truncated...]
Dec 09, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T16:00:35.716Z: Cancel request is committed for workflow job: 2021-12-09_04_45_32-5426460733272879647.
Dec 09, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T16:00:35.775Z: Cleaning up.
Dec 09, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T16:00:35.840Z: Stopping worker pool...
Dec 09, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T16:00:35.883Z: Stopping worker pool...
Dec 09, 2021 4:02:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T16:02:57.594Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 09, 2021 4:02:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-09T16:02:57.645Z: Worker pool stopped.
Dec 09, 2021 4:03:04 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-09_04_45_32-5426460733272879647 finished with status CANCELLED.
Load test results for test (ID): 8e65f303-f75b-47e3-98fb-ae3c0a81809a and timestamp: 2021-12-09T12:45:27.262000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11485.904
dataflow_v2_java11_total_bytes_count             2.31259995E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
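The job was force-cancelled (the "Cancel request is committed" line above) and the load-test harness then treats any terminal state other than DONE as a failure. A hypothetical sketch of that check — not the actual Beam source, which lives in org.apache.beam.sdk.loadtests.JobFailure#handleFailure — looks roughly like:

```java
public class TerminalStateCheck {
    // Illustrative subset of Dataflow terminal job states.
    enum JobState { DONE, FAILED, CANCELLED, UPDATED, UNKNOWN }

    // Anything other than DONE is surfaced as a RuntimeException, which is
    // why a cancelled-at-timeout job fails the Gradle task with exit value 1.
    static void handleFailure(JobState terminalState) {
        if (terminalState != JobState.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        handleFailure(JobState.DONE); // returns normally
        try {
            handleFailure(JobState.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // Invalid job state: CANCELLED.
        }
    }
}
```

Under this reading, the metrics above were still collected and reported, but the run is marked failed because cancellation, not completion, ended the pipeline.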

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211209124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6fc362308a5b8eeb128bfb04e8dd64602273890055124df7791dc1c938246498
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211209124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6fc362308a5b8eeb128bfb04e8dd64602273890055124df7791dc1c938246498]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211209124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6fc362308a5b8eeb128bfb04e8dd64602273890055124df7791dc1c938246498])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6fc362308a5b8eeb128bfb04e8dd64602273890055124df7791dc1c938246498
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6fc362308a5b8eeb128bfb04e8dd64602273890055124df7791dc1c938246498
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6fc362308a5b8eeb128bfb04e8dd64602273890055124df7791dc1c938246498].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 48s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/woqhzjvimojh4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #174

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/174/display/redirect?page=changes>

Changes:

[heejong] [BEAM-13092] Adding dummy external transform translators for Dataflow

[stranniknm] [BEAM-13112]: playground embedded version

[zyichi] [BEAM-13388] Fix broken google cloud dlp test.

[relax] don't store unserializable DatasetService as a member variable

[noreply] [BEAM-13236] Properly close kinesis producer on teardown (#15955)

[noreply] Merge pull request #16150 from [BEAM-13396][Playground][Bugfix] Issues

[noreply] Merge pull request #16148 from [BEAM-13394][Playground] [Bugfix] Fix

[noreply] Avoid overriding explicit portable job submission disabling. (#16143)

[noreply] Merge pull request #16151 from [BEAM-13350][Playground] Support running

[zyichi] [BEAM-13373] Increase python post commit timeout to reduce chance of

[melissapa] [BEAM-11758] Final cleanup for Beam Basics doc content

[msbukal] Exclude FhirIOPatientEverything from v2 dataflow runner intg test.

[noreply] [BEAM-13371] Fix bug where DataFrame overview snippets don't show up

[dpcollins] Add a workaround for https://github.com/googleapis/gax-java/issues/1577

[noreply] [BEAM-12976] Implement pipeline visitor to get global field access in…

[noreply] [BEAM-13388] Use 3.0.0 as lower bound for google-cloud-dlp (#16164)

[noreply] Merge pull request #16127 from [BEAM-13366] [Playground] Add support


------------------------------------------
[...truncated 48.88 KB...]
a81f1846a0d2: Preparing
aa7ec7dfc8db: Waiting
3b441d7cb46b: Preparing
d3710de04cb3: Preparing
927f9fcef4cf: Waiting
91f7336bbfff: Preparing
e2e8c39e0f77: Preparing
3b441d7cb46b: Waiting
d3710de04cb3: Waiting
91f7336bbfff: Waiting
e2e8c39e0f77: Waiting
5c81f9330d99: Waiting
27fdb79dd13a: Waiting
f1bf480ea614: Waiting
9d28abd78e00: Waiting
40d28832fb02: Waiting
a81f1846a0d2: Waiting
14c6fe2aabd9: Waiting
9c4912baf16d: Pushed
40287395a625: Pushed
5cb939ca3ff9: Pushed
e211da1ccfa6: Pushed
27fdb79dd13a: Pushed
35e0c0e168c1: Pushed
9d28abd78e00: Pushed
a6d14faf6ddb: Pushed
aa7ec7dfc8db: Pushed
f1bf480ea614: Pushed
5c81f9330d99: Layer already exists
40d28832fb02: Pushed
927f9fcef4cf: Layer already exists
3b441d7cb46b: Layer already exists
a81f1846a0d2: Layer already exists
91f7336bbfff: Layer already exists
d3710de04cb3: Layer already exists
e2e8c39e0f77: Layer already exists
14c6fe2aabd9: Pushed
20211208124335: digest: sha256:6fbf7af1c11155cb57821773962348b6b456c2c572fe026b2d5791065f5c7205 size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 08, 2021 12:45:30 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 08, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 08, 2021 12:45:31 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 08, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 08, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 08, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Dec 08, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 08, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash 49687711aa323dd7890ed414b0ec0b8666e3f6958cb1fe9bf1a0f53d929cd00c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-SWh3EaoyPdeJDtQUsOwLhmbj9pWMsf6b8aD1PZKc0Aw.pb
Dec 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 08, 2021 12:45:36 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3af7d855, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77049094, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f88bfbe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59bbe88a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d8ab698, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ed91d8d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@446626a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@429f7919, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a2929a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@cda6019, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@797c3c3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4012d5bc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4375b013, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cf0cacc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f5b08d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@529c2a9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c87fdf2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26bbe604, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe34b86]
Dec 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 08, 2021 12:45:36 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3206174f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@42fb8c87, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15eb0ae9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65e0b505, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67de7a99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@795f5d51, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34aeacd1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54067fdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4098dd77, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43aeb5e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2274160, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65383667, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63cd2cd2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@557a84fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6deee370, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49c17ba4]
Dec 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 08, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-08_04_45_36-18125747131358354902?project=apache-beam-testing
Dec 08, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-08_04_45_36-18125747131358354902
Dec 08, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-08_04_45_36-18125747131358354902
Dec 08, 2021 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-08T12:45:54.921Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-287c. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:00.429Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:01.193Z: Expanding SplittableParDo operations into optimizable parts.
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:01.227Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:01.323Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:01.411Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:01.432Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:01.492Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:01.625Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:01.661Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:01.692Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:01.751Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:01.783Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:01.814Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:01.846Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:01.880Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:01.912Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:01.944Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:01.993Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:02.036Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 08, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:02.075Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 08, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:02.111Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 08, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:02.160Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 08, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:02.196Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 08, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:02.233Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 08, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:02.267Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 08, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:02.299Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 08, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:02.332Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 08, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:02.359Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 08, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:02.412Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 08, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:02.446Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 08, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:02.873Z: Starting 5 workers in us-central1-a...
Dec 08, 2021 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:25.381Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 08, 2021 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:46:51.422Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 08, 2021 12:47:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:47:47.797Z: Workers have started successfully.
Dec 08, 2021 12:47:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T12:47:47.829Z: Workers have started successfully.
Dec 08, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T16:00:36.671Z: Cancel request is committed for workflow job: 2021-12-08_04_45_36-18125747131358354902.
Dec 08, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T16:00:36.788Z: Cleaning up.
Dec 08, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T16:00:36.862Z: Stopping worker pool...
Dec 08, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T16:00:36.939Z: Stopping worker pool...
Dec 08, 2021 4:03:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T16:03:01.170Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 08, 2021 4:03:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-08T16:03:01.210Z: Worker pool stopped.
Dec 08, 2021 4:03:07 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-08_04_45_36-18125747131358354902 finished with status CANCELLED.
Load test results for test (ID): 33695c4a-5fe0-4038-aae3-07348008d0fa and timestamp: 2021-12-08T12:45:30.982000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11525.754
dataflow_v2_java11_total_bytes_count             2.08982509E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211208124335
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6fbf7af1c11155cb57821773962348b6b456c2c572fe026b2d5791065f5c7205
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211208124335]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6fbf7af1c11155cb57821773962348b6b456c2c572fe026b2d5791065f5c7205]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211208124335] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6fbf7af1c11155cb57821773962348b6b456c2c572fe026b2d5791065f5c7205])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6fbf7af1c11155cb57821773962348b6b456c2c572fe026b2d5791065f5c7205
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6fbf7af1c11155cb57821773962348b6b456c2c572fe026b2d5791065f5c7205
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6fbf7af1c11155cb57821773962348b6b456c2c572fe026b2d5791065f5c7205].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 49s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...

Publishing failed.

The build scan server appears to be unavailable.
Please check https://status.gradle.com for the latest service status.

If the service is reported as available, please report this problem via https://gradle.com/help/plugin and include the following via copy/paste:

----------
Gradle version: 6.9.1
Plugin version: 3.4.1
Request URL: https://status.gradle.com
Request ID: 7032d7e9-65d9-4731-bfdd-41ffe1371627
Response status code: 405
Response server type: Varnish
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #173

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/173/display/redirect?page=changes>

Changes:

[Daniel Oliveira] Minor Gradle Fixups: Removing unneeded dependency in expansion service.

[aydar.zaynutdinov] [BEAM-13363][Playground]

[aydar.zaynutdinov] [BEAM-13357][Playground]

[aydar.zaynutdinov] [BEAM-13357][Playground]

[Luke Cwik] [BEAM-13015] Simplify fake state API testing client and make it

[zyichi] [BEAM-13380] Fix broken seedjob.

[aydar.zaynutdinov] [BEAM-13357][Playground]

[noreply] Merge pull request #16137 from [BEAM-13381][Playground][Bugfix] Update

[noreply] Merge pull request #16107 from [BEAM-13332] [Playground] Add logs output

[noreply] Merge pull request #16109 from [BEAM-13330][Playground] Save Java logs

[noreply] Merge pull request #16125 from [BEAM-13322][Playground] Support Java

[melissapa] [BEAM-11758] Update basics page: Window, Watermark


------------------------------------------
[...truncated 48.44 KB...]
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
271c32230488: Preparing
726a61695ab5: Preparing
9a1d1db61f48: Preparing
ececa2c8bc7a: Preparing
e29ba269002e: Preparing
c1a2f2951374: Preparing
3b2ceb1f7cca: Preparing
c2c0a62080b9: Preparing
c1268161e168: Preparing
b2e8fca4b0ea: Preparing
c1b9b416beeb: Preparing
8b5c1dd46102: Preparing
5c81f9330d99: Preparing
927f9fcef4cf: Preparing
a81f1846a0d2: Preparing
3b441d7cb46b: Preparing
d3710de04cb3: Preparing
91f7336bbfff: Preparing
e2e8c39e0f77: Preparing
c1a2f2951374: Waiting
3b2ceb1f7cca: Waiting
3b441d7cb46b: Waiting
b2e8fca4b0ea: Waiting
927f9fcef4cf: Waiting
a81f1846a0d2: Waiting
8b5c1dd46102: Waiting
c1b9b416beeb: Waiting
d3710de04cb3: Waiting
c1268161e168: Waiting
e2e8c39e0f77: Waiting
5c81f9330d99: Waiting
c2c0a62080b9: Waiting
e29ba269002e: Pushed
9a1d1db61f48: Pushed
726a61695ab5: Pushed
ececa2c8bc7a: Pushed
c2c0a62080b9: Pushed
c1a2f2951374: Pushed
271c32230488: Pushed
3b2ceb1f7cca: Pushed
8b5c1dd46102: Pushed
5c81f9330d99: Layer already exists
c1b9b416beeb: Pushed
c1268161e168: Pushed
3b441d7cb46b: Layer already exists
927f9fcef4cf: Layer already exists
d3710de04cb3: Layer already exists
e2e8c39e0f77: Layer already exists
91f7336bbfff: Layer already exists
a81f1846a0d2: Layer already exists
b2e8fca4b0ea: Pushed
20211207124330: digest: sha256:c57efefbfb8148143d6649d313cfe5976a74525602f128add8238605ea64fcdf size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 07, 2021 12:45:23 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 07, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Dec 07, 2021 12:45:24 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 07, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 07, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 07, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Dec 07, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 07, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111769 bytes, hash a233856ecb74030dd04be2b5678462ed1d5deb20cecf53897c3ba34cd21372e5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ojOFbst0Aw3QS-K1Z4Ri7R1d6yDOz1OJfDujTNITcuU.pb
Dec 07, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 07, 2021 12:45:29 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@480ad82c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d18b73a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@177515d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@52ff99cd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c2af006, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e]
Dec 07, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 07, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 07, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 07, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 07, 2021 12:45:29 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58437801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6af5bbd0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76464795, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b34832b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48f4713c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f1868c9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7]
Dec 07, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 07, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 07, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 07, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 07, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 07, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 07, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 07, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 07, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 07, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 07, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 07, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 07, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-07_04_45_29-6626945001967699222?project=apache-beam-testing
Dec 07, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-07_04_45_29-6626945001967699222
Dec 07, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-07_04_45_29-6626945001967699222
Dec 07, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-07T12:45:38.017Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-sx2g. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 07, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:45.494Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.202Z: Expanding SplittableParDo operations into optimizable parts.
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.240Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.312Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.374Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.407Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.470Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.553Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.576Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.602Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.647Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.675Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.701Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.724Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.748Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.778Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.811Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.836Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.867Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.911Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.943Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:46.984Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:47.007Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:47.044Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:47.091Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:47.119Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:47.145Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:47.167Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:47.240Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:47.276Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 07, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:48.389Z: Starting 5 workers in us-central1-a...
Dec 07, 2021 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:45:59.126Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 07, 2021 12:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:46:32.753Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 07, 2021 12:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:47:29.975Z: Workers have started successfully.
Dec 07, 2021 12:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T12:47:30.004Z: Workers have started successfully.
Dec 07, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T16:00:37.032Z: Cancel request is committed for workflow job: 2021-12-07_04_45_29-6626945001967699222.
Dec 07, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T16:00:37.695Z: Cleaning up.
Dec 07, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T16:00:37.793Z: Stopping worker pool...
Dec 07, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T16:00:37.872Z: Stopping worker pool...
Dec 07, 2021 4:02:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T16:02:57.835Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 07, 2021 4:02:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-07T16:02:57.865Z: Worker pool stopped.
Dec 07, 2021 4:03:04 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-07_04_45_29-6626945001967699222 finished with status CANCELLED.
Load test results for test (ID): dd30530b-7af9-4c29-b78e-45b2439a0ef0 and timestamp: 2021-12-07T12:45:24.625000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11547.852
dataflow_v2_java11_total_bytes_count             1.40517518E10
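[Editor's note: the two metric lines above can be combined into an approximate sustained-throughput figure for the run. The calculation below is an addition for context, not part of the original harness output; the values are copied verbatim from the log.]

```python
# Values reported by the load-test harness in the summary above.
runtime_sec = 11547.852        # dataflow_v2_java11_runtime_sec
total_bytes = 1.40517518e10    # dataflow_v2_java11_total_bytes_count

# Sustained throughput over the whole run.
throughput_bytes_per_sec = total_bytes / runtime_sec
throughput_mib_per_sec = throughput_bytes_per_sec / (1024 ** 2)

print(f"{throughput_bytes_per_sec:,.0f} B/s (~{throughput_mib_per_sec:.2f} MiB/s)")
```

This works out to roughly 1.2 million bytes per second across the whole ~3.2-hour run.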
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
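[Editor's note: the RuntimeException above is the harness reporting a failed run, not a crash: the job was cancelled externally at 16:00 (about 3h15m after submission), and JobFailure.handleFailure treats any terminal state other than DONE as a failure. The sketch below illustrates that kind of terminal-state check in Python; the state names mirror Dataflow's, but the function itself is an illustrative assumption, not the actual Beam implementation.]

```python
# Dataflow-style terminal job states (illustrative subset).
TERMINAL_STATES = {"DONE", "FAILED", "CANCELLED", "UPDATED", "DRAINED"}

def handle_terminal_state(state: str) -> None:
    """Raise if the job ended in any terminal state other than DONE.

    Hypothetical helper mimicking what a load-test harness does after
    waiting for a job: only a clean DONE counts as a passing run.
    """
    if state not in TERMINAL_STATES:
        raise ValueError(f"Job is not in a terminal state: {state}")
    if state != "DONE":
        raise RuntimeError(f"Invalid job state: {state}.")
```

Under this check, a streaming job cancelled by a harness-side timeout surfaces exactly as in the log: a RuntimeError of the form "Invalid job state: CANCELLED."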

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211207124330
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c57efefbfb8148143d6649d313cfe5976a74525602f128add8238605ea64fcdf
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211207124330]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c57efefbfb8148143d6649d313cfe5976a74525602f128add8238605ea64fcdf]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211207124330] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c57efefbfb8148143d6649d313cfe5976a74525602f128add8238605ea64fcdf])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c57efefbfb8148143d6649d313cfe5976a74525602f128add8238605ea64fcdf
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c57efefbfb8148143d6649d313cfe5976a74525602f128add8238605ea64fcdf
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c57efefbfb8148143d6649d313cfe5976a74525602f128add8238605ea64fcdf].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 51s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/tq5dwqa5ycmgy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #172

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/172/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #16118 from [BEAM-13362][Playground] python ci cd


------------------------------------------
[...truncated 49.22 KB...]
97ff91347834: Preparing
9b06cfb03e9d: Preparing
9663fdc07379: Preparing
4496f816e1b9: Preparing
0e80a617bb77: Preparing
47924046174a: Preparing
745b84e806b4: Preparing
5c81f9330d99: Preparing
927f9fcef4cf: Preparing
a81f1846a0d2: Preparing
3b441d7cb46b: Preparing
d3710de04cb3: Preparing
91f7336bbfff: Preparing
97ff91347834: Waiting
e2e8c39e0f77: Preparing
4496f816e1b9: Waiting
0e80a617bb77: Waiting
47924046174a: Waiting
9b06cfb03e9d: Waiting
745b84e806b4: Waiting
9663fdc07379: Waiting
5c81f9330d99: Waiting
d3710de04cb3: Waiting
91f7336bbfff: Waiting
a81f1846a0d2: Waiting
3b441d7cb46b: Waiting
e2e8c39e0f77: Waiting
b6f5d0825c23: Pushed
7bf7eab0c69d: Pushed
f00185611714: Pushed
9afec5a4e288: Pushed
6a0c9ee8153e: Pushed
97ff91347834: Pushed
9663fdc07379: Pushed
4496f816e1b9: Pushed
5c81f9330d99: Layer already exists
927f9fcef4cf: Layer already exists
a81f1846a0d2: Layer already exists
3b441d7cb46b: Layer already exists
d3710de04cb3: Layer already exists
47924046174a: Pushed
91f7336bbfff: Layer already exists
e2e8c39e0f77: Layer already exists
9b06cfb03e9d: Pushed
745b84e806b4: Pushed
0e80a617bb77: Pushed
20211206124332: digest: sha256:572b1a30b1310735b262a0c8f30816ee780cc6e86400786523db0855a3b54617 size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 06, 2021 12:45:20 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 06, 2021 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Dec 06, 2021 12:45:21 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 06, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 06, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 06, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Dec 06, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 06, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111769 bytes, hash 6a52a15ac2806921d65edc8b2b04c35f27c179449ed75e09b1e568b06cff53d2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-alKhWsKAaSHWXtyLKwTDXyfBeUSe114JseVosGz_U9I.pb
Dec 06, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 06, 2021 12:45:26 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@480ad82c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d18b73a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@177515d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@52ff99cd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c2af006, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e]
Dec 06, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 06, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 06, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 06, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 06, 2021 12:45:26 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58437801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6af5bbd0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76464795, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b34832b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48f4713c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f1868c9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7]
Dec 06, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 06, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 06, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 06, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 06, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 06, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 06, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 06, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 06, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 06, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 06, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 06, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 06, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-06_04_45_26-1700801981759437675?project=apache-beam-testing
Dec 06, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-06_04_45_26-1700801981759437675
Dec 06, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-06_04_45_26-1700801981759437675
Dec 06, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-06T12:45:33.536Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-jbsc. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:38.362Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.037Z: Expanding SplittableParDo operations into optimizable parts.
Dec 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.068Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.117Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.178Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.207Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.275Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.393Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.418Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.449Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.483Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.506Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.534Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.563Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.598Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.631Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 06, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.654Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 06, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.681Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 06, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.703Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 06, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.734Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 06, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.769Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 06, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.804Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 06, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.838Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 06, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.873Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 06, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.907Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 06, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.942Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 06, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.969Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 06, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:39.999Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 06, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:40.037Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 06, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:40.068Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 06, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:40.385Z: Starting 5 workers in us-central1-a...
Dec 06, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:45:57.339Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 06, 2021 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:46:25.054Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 06, 2021 12:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:47:23.696Z: Workers have started successfully.
Dec 06, 2021 12:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T12:47:23.742Z: Workers have started successfully.
Dec 06, 2021 3:57:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-06T15:57:41.624Z: Staged package classgraph-4.8.104-ePvfM6WdiJMBzk05ytbzmopew--T1rqGqw_PYcvi6h0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/classgraph-4.8.104-ePvfM6WdiJMBzk05ytbzmopew--T1rqGqw_PYcvi6h0.jar' is inaccessible.
Dec 06, 2021 3:57:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-06T15:57:43.985Z: Staged package junit-4.13.1-wwcZ25dNZFJ5P-GRs2OKV3cAVIW64UWSQERTD_pfYSI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/junit-4.13.1-wwcZ25dNZFJ5P-GRs2OKV3cAVIW64UWSQERTD_pfYSI.jar' is inaccessible.
Dec 06, 2021 3:57:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-06T15:57:44.980Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 06, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T16:00:36.710Z: Cancel request is committed for workflow job: 2021-12-06_04_45_26-1700801981759437675.
Dec 06, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T16:00:36.754Z: Cleaning up.
Dec 06, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T16:00:36.829Z: Stopping worker pool...
Dec 06, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T16:00:36.895Z: Stopping worker pool...
Dec 06, 2021 4:03:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T16:03:01.116Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 06, 2021 4:03:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-06T16:03:01.146Z: Worker pool stopped.
Dec 06, 2021 4:03:07 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-06_04_45_26-1700801981759437675 finished with status CANCELLED.
Load test results for test (ID): 573eb212-12de-4267-8180-bb1d133d9b99 and timestamp: 2021-12-06T12:45:20.917000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11562.718
dataflow_v2_java11_total_bytes_count             1.50949697E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211206124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:572b1a30b1310735b262a0c8f30816ee780cc6e86400786523db0855a3b54617
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211206124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:572b1a30b1310735b262a0c8f30816ee780cc6e86400786523db0855a3b54617]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211206124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:572b1a30b1310735b262a0c8f30816ee780cc6e86400786523db0855a3b54617])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:572b1a30b1310735b262a0c8f30816ee780cc6e86400786523db0855a3b54617
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:572b1a30b1310735b262a0c8f30816ee780cc6e86400786523db0855a3b54617
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:572b1a30b1310735b262a0c8f30816ee780cc6e86400786523db0855a3b54617].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 50s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/pwrhl75m3uyru

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #171

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/171/display/redirect>

Changes:


------------------------------------------
[...truncated 48.72 KB...]
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
8f591258c8cc: Preparing
b7ee6d446b95: Preparing
4de9b9373a95: Preparing
2efcee59fe64: Preparing
83384815c3ff: Preparing
698f82620171: Preparing
da02b3bc4ded: Preparing
5b928413235f: Preparing
1cbb5ccdd4b0: Preparing
bee9533ee01d: Preparing
e62ec33cc1b2: Preparing
f92d8dd645b0: Preparing
5c81f9330d99: Preparing
927f9fcef4cf: Preparing
a81f1846a0d2: Preparing
3b441d7cb46b: Preparing
d3710de04cb3: Preparing
91f7336bbfff: Preparing
e2e8c39e0f77: Preparing
f92d8dd645b0: Waiting
d3710de04cb3: Waiting
5c81f9330d99: Waiting
927f9fcef4cf: Waiting
91f7336bbfff: Waiting
e2e8c39e0f77: Waiting
a81f1846a0d2: Waiting
5b928413235f: Waiting
3b441d7cb46b: Waiting
1cbb5ccdd4b0: Waiting
698f82620171: Waiting
da02b3bc4ded: Waiting
bee9533ee01d: Waiting
4de9b9373a95: Pushed
b7ee6d446b95: Pushed
83384815c3ff: Pushed
8f591258c8cc: Pushed
698f82620171: Pushed
2efcee59fe64: Pushed
5b928413235f: Pushed
1cbb5ccdd4b0: Pushed
5c81f9330d99: Layer already exists
927f9fcef4cf: Layer already exists
e62ec33cc1b2: Pushed
3b441d7cb46b: Layer already exists
a81f1846a0d2: Layer already exists
91f7336bbfff: Layer already exists
d3710de04cb3: Layer already exists
e2e8c39e0f77: Layer already exists
da02b3bc4ded: Pushed
f92d8dd645b0: Pushed
bee9533ee01d: Pushed
20211205124332: digest: sha256:07aefa9e0f355d4d4ef54aba3a76b22efc4eb6008d0187d37735b8f8ba6c9163 size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 05, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Dec 05, 2021 12:45:37 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 05, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 05, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 05, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Dec 05, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 05, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111769 bytes, hash 9816ff88d3f4a30c7523df597543fda83e0690eff08f5abf1e53b67516fc7bdf> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-mBb_iNP0owx1I99ZdUP9qD4GkO_wj1q_HlO2dRb8e98.pb
Dec 05, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 05, 2021 12:45:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d18b73a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@177515d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@52ff99cd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c2af006, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f]
Dec 05, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 05, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 05, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 05, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 05, 2021 12:45:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6af5bbd0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76464795, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b34832b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48f4713c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f1868c9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70]
Dec 05, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 05, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 05, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 05, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 05, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 05, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 05, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 05, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 05, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 05, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 05, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 05, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 05, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-05_04_45_42-14422065288940656869?project=apache-beam-testing
Dec 05, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-05_04_45_42-14422065288940656869
Dec 05, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-05_04_45_42-14422065288940656869
Dec 05, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-05T12:45:49.356Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-n9e9. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 05, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:55.458Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 05, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.097Z: Expanding SplittableParDo operations into optimizable parts.
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.131Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.197Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.272Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.309Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.377Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.477Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.512Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.544Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.576Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.610Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.635Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.670Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.704Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.736Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.770Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.796Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.829Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.862Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.893Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.927Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:56.972Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:57.014Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:57.040Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:57.074Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:57.108Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:57.143Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:57.171Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:57.203Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:45:57.549Z: Starting 5 workers in us-central1-a...
Dec 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:46:12.165Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 05, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:46:37.046Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 05, 2021 12:47:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:47:36.725Z: Workers have started successfully.
Dec 05, 2021 12:47:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T12:47:36.755Z: Workers have started successfully.
Dec 05, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T16:00:33.271Z: Cancel request is committed for workflow job: 2021-12-05_04_45_42-14422065288940656869.
Dec 05, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T16:00:33.347Z: Cleaning up.
Dec 05, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T16:00:33.427Z: Stopping worker pool...
Dec 05, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T16:00:33.469Z: Stopping worker pool...
Dec 05, 2021 4:02:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T16:02:51.090Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 05, 2021 4:02:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-05T16:02:51.132Z: Worker pool stopped.
Dec 05, 2021 4:02:57 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-05_04_45_42-14422065288940656869 finished with status CANCELLED.
Load test results for test (ID): e9f64209-708c-43ba-bd02-87b35c43f3f8 and timestamp: 2021-12-05T12:45:37.387000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11543.136
dataflow_v2_java11_total_bytes_count             2.53889161E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211205124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:07aefa9e0f355d4d4ef54aba3a76b22efc4eb6008d0187d37735b8f8ba6c9163
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211205124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:07aefa9e0f355d4d4ef54aba3a76b22efc4eb6008d0187d37735b8f8ba6c9163]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211205124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:07aefa9e0f355d4d4ef54aba3a76b22efc4eb6008d0187d37735b8f8ba6c9163])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:07aefa9e0f355d4d4ef54aba3a76b22efc4eb6008d0187d37735b8f8ba6c9163
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:07aefa9e0f355d4d4ef54aba3a76b22efc4eb6008d0187d37735b8f8ba6c9163
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:07aefa9e0f355d4d4ef54aba3a76b22efc4eb6008d0187d37735b8f8ba6c9163].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 43s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ghja2lih3zvok

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #170

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/170/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Revert "Revert "Allow wildcards for java class lookup transform

[Robert Bradshaw] Move stand-alone expansion service jar into its own project.

[noreply] Merge pull request #16117 from [BEAM-13368][Playground][Bugfix] Fix CI

[noreply] Run python SpannerIO IT with python 3.7 only to avoid overload spanner

[noreply] Merge pull request #15378 from [RFC] Define and document per-key

[noreply] Bump python containers to beam-master-20211202 (#16129)

[noreply] [BEAM-13354, BEAM-13015, BEAM-12802, BEAM-12588] Support prefetch for

[zyichi] Fix failing RecommendationAICatalogItemIT

[Valentyn Tymofieiev] Clarify instructions on how to get contributor list.

[noreply] [BEAM-11936] Fix errorprone UnusedVariable in IO (#16036)


------------------------------------------
[...truncated 49.09 KB...]
00126f09d5b7: Preparing
fd2c289dd740: Preparing
9437b340a45c: Preparing
5c81f9330d99: Preparing
927f9fcef4cf: Preparing
fd2c289dd740: Waiting
a81f1846a0d2: Preparing
3b441d7cb46b: Preparing
d3710de04cb3: Preparing
91f7336bbfff: Preparing
e2e8c39e0f77: Preparing
a0b0581a19cd: Waiting
89f7638f548c: Waiting
3b441d7cb46b: Waiting
9437b340a45c: Waiting
00126f09d5b7: Waiting
91f7336bbfff: Waiting
927f9fcef4cf: Waiting
d3710de04cb3: Waiting
e2e8c39e0f77: Waiting
5c81f9330d99: Waiting
91ab7bf1ec06: Pushed
a82ba1ba0f48: Pushed
1a9b9d06b27c: Pushed
499c60a48ffa: Pushed
19a3cff3639a: Pushed
af1cb12c8718: Pushed
a0b0581a19cd: Pushed
89f7638f548c: Pushed
5c81f9330d99: Layer already exists
927f9fcef4cf: Layer already exists
a81f1846a0d2: Layer already exists
3b441d7cb46b: Layer already exists
d3710de04cb3: Layer already exists
fd2c289dd740: Pushed
751487c16c82: Pushed
91f7336bbfff: Layer already exists
e2e8c39e0f77: Layer already exists
9437b340a45c: Pushed
00126f09d5b7: Pushed
20211204124331: digest: sha256:dc8c982f3bb510c2c8e1501a52cfad9709e289905da50aa6caa13a215156f720 size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 04, 2021 12:45:22 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 04, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Dec 04, 2021 12:45:23 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
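[Editorial note: the warning above fires because two Window.Into() transforms were applied without explicit names, so the translator has to uniquify the second one itself — which is why the step list later in this log shows both "Window.Into()/Window.Assign" and "Window.Into()2/Window.Assign". The usual fix is to pass an explicit name to apply(...). The sketch below is a hypothetical illustration of that numeric-suffix scheme, not Beam's actual implementation; the class name StepNameUniquifier is invented for this example.]

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical sketch of how a runner might assign unique step names when a
// pipeline reuses the same transform name (e.g. two unnamed Window.Into()
// applications). NOT Beam's actual implementation.
public class StepNameUniquifier {
    private final Set<String> used = new HashSet<>();

    // Returns the name unchanged on first use; subsequent uses get a numeric
    // suffix, producing "Window.Into()" then "Window.Into()2" as in this log.
    public String uniquify(String name) {
        String candidate = name;
        int counter = 2;
        while (!used.add(candidate)) {
            candidate = name + counter++;
        }
        return candidate;
    }

    public static void main(String[] args) {
        StepNameUniquifier names = new StepNameUniquifier();
        System.out.println(names.uniquify("Window.Into()"));  // Window.Into()
        System.out.println(names.uniquify("Window.Into()"));  // Window.Into()2
    }
}
```

Auto-generated names like these are "unstable" because they depend on application order, which breaks pipeline update; naming each application (e.g. apply("WindowInput", ...)) avoids the warning.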
Dec 04, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 04, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 04, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Dec 04, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 04, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111769 bytes, hash f78bcdd0db7651084d4df8e7ec462faabf4834fa6e16411a39f2abb2c2f20e57> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-94vN0Nt2UQhNTfjn7EYvqr9INPpuFkEaOfKrssLyDlc.pb
Dec 04, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 04, 2021 12:45:28 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53d13cd4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77865933, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@480ad82c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d18b73a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@177515d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@52ff99cd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c2af006, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a]
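[Editorial note: the line above reports the synthetic source pre-splitting itself into 20 bundles. As an illustration only — hypothetical code, not Beam's SyntheticUnboundedSource — partitioning a record range evenly across a desired bundle count can be sketched like this; RangeSplitter is an invented name.]

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical illustration of splitting a half-open [start, end) record
// range into a fixed number of roughly equal bundles, the way a synthetic
// source might pre-partition its keyspace. NOT Beam's actual implementation.
public class RangeSplitter {
    // Each long[]{lo, hi} is one bundle's half-open sub-range; the bundles
    // are contiguous and together cover [start, end) exactly.
    public static List<long[]> split(long start, long end, int bundles) {
        List<long[]> result = new ArrayList<>();
        long size = end - start;
        for (int i = 0; i < bundles; i++) {
            long lo = start + size * i / bundles;
            long hi = start + size * (i + 1) / bundles;
            result.add(new long[] {lo, hi});
        }
        return result;
    }

    public static void main(String[] args) {
        // 20 bundles, matching the bundle count in the log line above.
        List<long[]> parts = split(0, 20_000_000L, 20);
        System.out.println(parts.size() + " bundles, first = ["
            + parts.get(0)[0] + ", " + parts.get(0)[1] + ")");
    }
}
```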
Dec 04, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 04, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 04, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 04, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 04, 2021 12:45:28 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5399f6c5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fe64d23, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58437801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6af5bbd0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76464795, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b34832b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48f4713c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f1868c9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7]
Dec 04, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 04, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 04, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 04, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 04, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 04, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 04, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 04, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 04, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 04, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 04, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 04, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 04, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-04_04_45_28-4642740283095629077?project=apache-beam-testing
Dec 04, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-04_04_45_28-4642740283095629077
Dec 04, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-04_04_45_28-4642740283095629077
Dec 04, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-04T12:45:35.313Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-jiav. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
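[Editorial note: the modified job name in the warning above (load0tests0java110dataflow0v20streaming0cogbk01-...) suggests that characters invalid in a Cloud Label were replaced with '0'. The sketch below is a hypothetical illustration of that kind of sanitization — lowercase, replace disallowed characters, truncate to the 63-character label limit — and is NOT Dataflow's actual code; LabelSanitizer is an invented name.]

```java
import java.util.Locale;

// Hypothetical sketch of sanitizing a job name into a GCE-label-safe string:
// lowercase, characters outside [a-z0-9-] replaced with '0', truncated to 63
// characters. Mirrors the replacement pattern visible in the warning above,
// but is NOT Dataflow's actual implementation.
public class LabelSanitizer {
    public static String toLabel(String name) {
        String lower = name.toLowerCase(Locale.ROOT);
        String replaced = lower.replaceAll("[^a-z0-9-]", "0");
        return replaced.length() <= 63 ? replaced : replaced.substring(0, 63);
    }

    public static void main(String[] args) {
        // Underscores and spaces become '0', as in the modified job name above.
        System.out.println(toLabel("Load_Tests Java11_Dataflow_V2"));
    }
}
```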
Dec 04, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:38.926Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 04, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:39.592Z: Expanding SplittableParDo operations into optimizable parts.
Dec 04, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:39.630Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 04, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:39.718Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 04, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:39.790Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 04, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:39.817Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 04, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:39.869Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:39.961Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:39.992Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.026Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.049Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.082Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.118Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.144Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.174Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.199Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.226Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.251Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.274Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.295Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.334Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.364Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.390Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.416Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.457Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.485Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.511Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.533Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.565Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.591Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 04, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:45:40.923Z: Starting 5 workers in us-central1-a...
Dec 04, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:46:08.169Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 04, 2021 12:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:46:30.993Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 04, 2021 12:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:47:26.331Z: Workers have started successfully.
Dec 04, 2021 12:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T12:47:26.355Z: Workers have started successfully.
Dec 04, 2021 1:54:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-04T13:54:41.130Z: Staged package amazon-kinesis-producer-0.14.1-5YYT_oyDFV17VWs8Of43O7N08sRfHu0SuJeqVSFCuLg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/amazon-kinesis-producer-0.14.1-5YYT_oyDFV17VWs8Of43O7N08sRfHu0SuJeqVSFCuLg.jar' is inaccessible.
Dec 04, 2021 1:54:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-04T13:54:44.602Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 04, 2021 1:57:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-04T13:57:44.299Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 04, 2021 2:00:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-04T14:00:41.124Z: Staged package amazon-kinesis-producer-0.14.1-5YYT_oyDFV17VWs8Of43O7N08sRfHu0SuJeqVSFCuLg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/amazon-kinesis-producer-0.14.1-5YYT_oyDFV17VWs8Of43O7N08sRfHu0SuJeqVSFCuLg.jar' is inaccessible.
Dec 04, 2021 2:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-04T14:00:44.928Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 04, 2021 2:03:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-04T14:03:44.111Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 04, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T16:00:31.584Z: Cancel request is committed for workflow job: 2021-12-04_04_45_28-4642740283095629077.
Dec 04, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T16:00:31.617Z: Cleaning up.
Dec 04, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T16:00:31.686Z: Stopping worker pool...
Dec 04, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T16:00:31.750Z: Stopping worker pool...
Dec 04, 2021 4:02:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T16:02:55.163Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 04, 2021 4:02:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-04T16:02:55.197Z: Worker pool stopped.
Dec 04, 2021 4:03:00 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-04_04_45_28-4642740283095629077 finished with status CANCELLED.
Load test results for test (ID): 74b0eb54-c82a-4eff-8144-1733ae9f263a and timestamp: 2021-12-04T12:45:22.779000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11503.363
dataflow_v2_java11_total_bytes_count             1.94954657E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
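[Editorial note: the stack trace above shows the load-test harness treating any terminal job state other than DONE as a failure — the job was cancelled rather than finishing, so the run surfaced a RuntimeException. The sketch below is a hypothetical simplification of that terminal-state check; the real logic lives in org.apache.beam.sdk.loadtests.JobFailure and differs in detail, and the class JobStateCheck is invented for this example.]

```java
// Hypothetical simplified sketch of the terminal-state check behind the
// "Invalid job state: CANCELLED." error above. NOT Beam's actual JobFailure
// implementation.
public class JobStateCheck {
    public enum State { DONE, FAILED, CANCELLED, UPDATED }

    // A load test run is only considered successful if the pipeline reached
    // DONE; any other terminal state (including operator cancellation) fails.
    public static void handleFailure(State terminalState) {
        if (terminalState != State.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        handleFailure(State.DONE); // a DONE job passes silently
        try {
            handleFailure(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // Invalid job state: CANCELLED.
        }
    }
}
```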

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211204124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dc8c982f3bb510c2c8e1501a52cfad9709e289905da50aa6caa13a215156f720
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211204124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dc8c982f3bb510c2c8e1501a52cfad9709e289905da50aa6caa13a215156f720]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211204124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dc8c982f3bb510c2c8e1501a52cfad9709e289905da50aa6caa13a215156f720])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dc8c982f3bb510c2c8e1501a52cfad9709e289905da50aa6caa13a215156f720
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dc8c982f3bb510c2c8e1501a52cfad9709e289905da50aa6caa13a215156f720
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dc8c982f3bb510c2c8e1501a52cfad9709e289905da50aa6caa13a215156f720].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 46s
101 actionable tasks: 71 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ahs7e476yi63a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #169

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/169/display/redirect?page=changes>

Changes:

[aydar.zaynutdinov] [BEAM-13329][Playground]

[aydar.zaynutdinov] [BEAM-13329][Playground]

[alexander.zhuravlev] [BEAM-13370] Deleted unused prints & strings

[ilya.kozyrev] Fix pylint issues and apply yapf with Beam config

[ilya.kozyrev] fix white spaces

[noreply] Don't pin a particular version of Tensorflow. (#16102)

[noreply] [BEAM-12733] Fix failing integration tests for Java Recommendation AI

[noreply] [BEAM-13288] improve logging for no rows present error (#16096)


------------------------------------------
[...truncated 48.91 KB...]
e2eb060785c6: Preparing
709a030d6410: Preparing
47ba6c76f0c9: Preparing
c1d696a4b3a1: Preparing
22c136e61d12: Preparing
d0c7bda4e558: Preparing
5c81f9330d99: Preparing
927f9fcef4cf: Preparing
a81f1846a0d2: Preparing
3b441d7cb46b: Preparing
d3710de04cb3: Preparing
91f7336bbfff: Preparing
e2e8c39e0f77: Preparing
d0c7bda4e558: Waiting
47ba6c76f0c9: Waiting
d3710de04cb3: Waiting
22c136e61d12: Waiting
91f7336bbfff: Waiting
5c81f9330d99: Waiting
c1d696a4b3a1: Waiting
e2e8c39e0f77: Waiting
3b441d7cb46b: Waiting
a81f1846a0d2: Waiting
709a030d6410: Waiting
a0c9a60fb894: Waiting
191dfd937c71: Pushed
7e9dfde64951: Pushed
a3b0537b01a1: Pushed
247e5c7bec76: Pushed
a0c9a60fb894: Pushed
f53aaaa20428: Pushed
709a030d6410: Pushed
47ba6c76f0c9: Pushed
5c81f9330d99: Layer already exists
927f9fcef4cf: Layer already exists
a81f1846a0d2: Layer already exists
3b441d7cb46b: Layer already exists
d3710de04cb3: Layer already exists
22c136e61d12: Pushed
e2eb060785c6: Pushed
91f7336bbfff: Layer already exists
e2e8c39e0f77: Layer already exists
d0c7bda4e558: Pushed
c1d696a4b3a1: Pushed
20211203124333: digest: sha256:32715126f0b40ff91a6606824fd709ff77d81b6a11c6dc08f9908833c466fff7 size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 03, 2021 12:45:35 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 03, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Dec 03, 2021 12:45:36 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 03, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 03, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 03, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Dec 03, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 03, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111769 bytes, hash bce5e84e0c79f033e20e0b1e119be0d3d85385cdefafa9adadb64b1c121b5982> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-vOXoTgx58DPiDgseEZvg09hThc3vr6mtrbZLHBIbWYI.pb
Dec 03, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 03, 2021 12:45:40 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53d13cd4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77865933, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@480ad82c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d18b73a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@177515d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@52ff99cd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c2af006, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a]
Dec 03, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 03, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 03, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 03, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 03, 2021 12:45:40 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5399f6c5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fe64d23, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58437801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6af5bbd0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76464795, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b34832b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48f4713c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f1868c9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7]
Dec 03, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 03, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 03, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 03, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 03, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 03, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 03, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 03, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 03, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 03, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 03, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 03, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 03, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-03_04_45_40-9024301071661498886?project=apache-beam-testing
Dec 03, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-03_04_45_40-9024301071661498886
Dec 03, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-03_04_45_40-9024301071661498886
Dec 03, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-03T12:45:48.216Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-b4fr. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:52.848Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:53.407Z: Expanding SplittableParDo operations into optimizable parts.
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:53.437Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:53.514Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:53.573Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:53.679Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:53.728Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:53.832Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:53.859Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:53.892Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:53.924Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:53.959Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:53.990Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:54.015Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:54.050Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:54.097Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:54.121Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:54.155Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:54.202Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:54.240Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:54.274Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:54.303Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:54.331Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:54.367Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:54.399Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:54.423Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:54.478Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:54.504Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:54.527Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:54.561Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:45:54.864Z: Starting 5 workers in us-central1-a...
Dec 03, 2021 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:46:24.077Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 03, 2021 12:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:46:39.260Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 03, 2021 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:47:37.519Z: Workers have started successfully.
Dec 03, 2021 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T12:47:37.549Z: Workers have started successfully.
Dec 03, 2021 2:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-03T14:45:55.930Z: Staged package beam-sdks-java-io-kinesis-2.36.0-SNAPSHOT-dHp3AUpnCyW8BI5_dm34xo4w2zHWNjlCUzZMmsdn-aM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-kinesis-2.36.0-SNAPSHOT-dHp3AUpnCyW8BI5_dm34xo4w2zHWNjlCUzZMmsdn-aM.jar' is inaccessible.
Dec 03, 2021 2:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-03T14:45:56.068Z: Staged package beam-sdks-java-load-tests-2.36.0-SNAPSHOT-AWXw3244IpmjjLbDXVk6FhB1JxiEeTDrXIDH5IPOx9U.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-load-tests-2.36.0-SNAPSHOT-AWXw3244IpmjjLbDXVk6FhB1JxiEeTDrXIDH5IPOx9U.jar' is inaccessible.
Dec 03, 2021 2:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-12-03T14:45:58.172Z: Staged package jackson-dataformat-cbor-2.13.0-87i8Y8vQiEl2BVWwSbyVD2k9G-KE9nB4qPOXblapNm4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-dataformat-cbor-2.13.0-87i8Y8vQiEl2BVWwSbyVD2k9G-KE9nB4qPOXblapNm4.jar' is inaccessible.
Dec 03, 2021 2:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-03T14:45:59.594Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Dec 03, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T16:00:36.377Z: Cancel request is committed for workflow job: 2021-12-03_04_45_40-9024301071661498886.
Dec 03, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T16:00:36.420Z: Cleaning up.
Dec 03, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T16:00:36.519Z: Stopping worker pool...
Dec 03, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T16:00:36.589Z: Stopping worker pool...
Dec 03, 2021 4:03:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T16:02:59.353Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 03, 2021 4:03:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-03T16:02:59.381Z: Worker pool stopped.
Dec 03, 2021 4:03:06 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-03_04_45_40-9024301071661498886 finished with status CANCELLED.
Load test results for test (ID): 2baf55fd-dba5-4a68-95ea-bdae2153e17a and timestamp: 2021-12-03T12:45:35.937000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11541.252
dataflow_v2_java11_total_bytes_count             1.90487712E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

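[Editor's note: the load test prints its results as a two-column "Metric: / Value:" block just before the process exits, as seen above. A minimal sketch of how such a block could be scraped out of a log like this one; the helper name and regex are illustrative, not part of Beam or this build:]

```python
import re

def parse_load_test_metrics(log_text: str) -> dict:
    """Extract name/value pairs from a 'Metric: Value:' results block.

    Assumes each metric line after the header is '<name> <number>',
    as emitted in the log above; parsing stops at the first line
    that does not fit that shape.
    """
    metrics = {}
    in_block = False
    for line in log_text.splitlines():
        if "Metric:" in line and "Value:" in line:
            in_block = True
            continue
        if in_block:
            m = re.match(r"\s*(\S+)\s+([0-9.Ee+-]+)\s*$", line)
            if not m:
                break  # end of the results block
            metrics[m.group(1)] = float(m.group(2))
    return metrics

# Sample taken verbatim from the build output above.
sample = """\
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11541.252
dataflow_v2_java11_total_bytes_count             1.90487712E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
"""
print(parse_load_test_metrics(sample))
```

Note that scientific notation such as `1.90487712E10` is handled by `float()` directly.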
> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211203124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:32715126f0b40ff91a6606824fd709ff77d81b6a11c6dc08f9908833c466fff7
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211203124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:32715126f0b40ff91a6606824fd709ff77d81b6a11c6dc08f9908833c466fff7]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211203124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:32715126f0b40ff91a6606824fd709ff77d81b6a11c6dc08f9908833c466fff7])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:32715126f0b40ff91a6606824fd709ff77d81b6a11c6dc08f9908833c466fff7
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:32715126f0b40ff91a6606824fd709ff77d81b6a11c6dc08f9908833c466fff7
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:32715126f0b40ff91a6606824fd709ff77d81b6a11c6dc08f9908833c466fff7].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 49s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ws6ap6y5zswgm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #168

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/168/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Require explicit designation of proxy partitioning specification.

[Robert Bradshaw] Elementwise transforms preserve all partitionings.

[aydar.zaynutdinov] [BEAM-13360][Playground]

[noreply] Merge pull request #16011 from [BEAM-12164] Add models for Spanner

[noreply] [BEAM-13335] Use signed int64 range for DataFrame read unique indexes

[iyi] Add test case for untriggered expansion with temp tables.

[zyichi] Fix small typo in ProcessBundleHandlerTest.java

[noreply] Merge pull request #16079 from [BEAM-13241] [Playground] Frontend

[noreply] Ricardo Case Study (#16087)

[noreply] Merge pull request #16086 from [BEAM-13258][Playground] Get and store

[noreply] [BEAM-11936] Fix errorprone UnusedVariable in Runners (#16035)

[noreply] [BEAM-8123] Add cloudpickle as optional library (#15472)

[noreply] [BEAM-12572] Beam python examples continuously exercised on at least 2


------------------------------------------
[...truncated 48.95 KB...]

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
c862c4aeb96b: Preparing
d61fa9a6824b: Preparing
691c0fbd62ab: Preparing
7127b4d552e0: Preparing
860d21c74383: Preparing
0d0f6a907076: Preparing
1bd26ae5df9b: Preparing
3678452d6023: Preparing
70d2d983890a: Preparing
4eebf714a030: Preparing
348562b66da8: Preparing
199c65c9b847: Preparing
5c81f9330d99: Preparing
927f9fcef4cf: Preparing
a81f1846a0d2: Preparing
3b441d7cb46b: Preparing
d3710de04cb3: Preparing
91f7336bbfff: Preparing
e2e8c39e0f77: Preparing
199c65c9b847: Waiting
1bd26ae5df9b: Waiting
d3710de04cb3: Waiting
91f7336bbfff: Waiting
5c81f9330d99: Waiting
3678452d6023: Waiting
e2e8c39e0f77: Waiting
70d2d983890a: Waiting
927f9fcef4cf: Waiting
4eebf714a030: Waiting
a81f1846a0d2: Waiting
0d0f6a907076: Waiting
860d21c74383: Pushed
691c0fbd62ab: Pushed
d61fa9a6824b: Pushed
c862c4aeb96b: Pushed
0d0f6a907076: Pushed
7127b4d552e0: Pushed
3678452d6023: Pushed
70d2d983890a: Pushed
5c81f9330d99: Layer already exists
199c65c9b847: Pushed
1bd26ae5df9b: Pushed
348562b66da8: Pushed
927f9fcef4cf: Layer already exists
a81f1846a0d2: Layer already exists
3b441d7cb46b: Layer already exists
d3710de04cb3: Layer already exists
91f7336bbfff: Layer already exists
e2e8c39e0f77: Layer already exists
4eebf714a030: Pushed
20211202124333: digest: sha256:f4dd28e5ce64b7d74204df409ea446fae9e268806ccceae69cb903a1a744811a size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 02, 2021 12:45:48 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 02, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Dec 02, 2021 12:45:49 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 02, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 02, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 02, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Dec 02, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 02, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111769 bytes, hash 616e89017161d40fdcc1b931fe819a245586534650ee036fc93130a830296f3c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-YW6JAXFh1A_cwbkx_oGaJFWGU0ZQ7gNvyTEwqDApbzw.pb
Dec 02, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 02, 2021 12:45:54 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77865933, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@480ad82c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d18b73a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@177515d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@52ff99cd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c2af006, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86]
Dec 02, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 02, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 02, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 02, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 02, 2021 12:45:54 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fe64d23, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58437801, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6af5bbd0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76464795, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b34832b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48f4713c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f1868c9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20]
Dec 02, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 02, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 02, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 02, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 02, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 02, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 02, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 02, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 02, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 02, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 02, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 02, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-02_04_45_54-11669014248312983346?project=apache-beam-testing
Dec 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-02_04_45_54-11669014248312983346
Dec 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-02_04_45_54-11669014248312983346
Dec 02, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-02T12:46:01.070Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-5viu. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 02, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:05.318Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 02, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:05.838Z: Expanding SplittableParDo operations into optimizable parts.
Dec 02, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:05.864Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 02, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:05.958Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 02, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.041Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 02, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.068Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 02, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.137Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 02, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.245Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 02, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.286Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 02, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.308Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 02, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.343Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 02, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.376Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 02, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.413Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 02, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.447Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 02, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.497Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 02, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.533Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 02, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.568Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 02, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.600Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 02, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.636Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 02, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.671Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 02, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.695Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 02, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.718Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 02, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.744Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 02, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.782Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 02, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.824Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 02, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.853Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 02, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.887Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 02, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.923Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 02, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.958Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 02, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:06.991Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 02, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:07.349Z: Starting 5 workers in us-central1-a...
Dec 02, 2021 12:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:16.735Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 02, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:46:54.900Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 02, 2021 12:47:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:47:54.497Z: Workers have started successfully.
Dec 02, 2021 12:47:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T12:47:54.541Z: Workers have started successfully.
Dec 02, 2021 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T16:00:35.714Z: Cancel request is committed for workflow job: 2021-12-02_04_45_54-11669014248312983346.
Dec 02, 2021 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T16:00:35.788Z: Cleaning up.
Dec 02, 2021 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T16:00:35.862Z: Stopping worker pool...
Dec 02, 2021 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T16:00:35.915Z: Stopping worker pool...
Dec 02, 2021 4:02:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T16:02:55.359Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 02, 2021 4:02:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-02T16:02:55.397Z: Worker pool stopped.
Dec 02, 2021 4:03:01 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-02_04_45_54-11669014248312983346 finished with status CANCELLED.
Load test results for test (ID): 50148b1c-14be-4ae4-8a3a-291c3ce095a6 and timestamp: 2021-12-02T12:45:48.814000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11528.599
dataflow_v2_java11_total_bytes_count             2.10152696E10
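As a rough sanity check on the two reported metrics (this computation is not part of the harness, just a hypothetical back-of-envelope), the run averaged roughly 1.7 MiB/s over its ~3.2-hour lifetime:

```java
public class ThroughputCheck {
    public static void main(String[] args) {
        // Values copied from the load-test summary above.
        double runtimeSec = 11528.599;      // dataflow_v2_java11_runtime_sec
        double totalBytes = 2.10152696E10;  // dataflow_v2_java11_total_bytes_count
        double mibPerSec = totalBytes / runtimeSec / (1024.0 * 1024.0);
        System.out.printf("avg throughput: %.2f MiB/s%n", mibPerSec);
    }
}
```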
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
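The stack trace shows the load-test harness treating the CANCELLED terminal state as a failure, which is what turns a deliberately time-boxed streaming run into a red build. A minimal sketch of that kind of terminal-state check (hypothetical names; not the actual Beam `JobFailure` implementation):

```java
import java.util.EnumSet;

public class TerminalStateCheck {
    enum JobState { DONE, CANCELLED, FAILED, RUNNING }

    // Any terminal state other than DONE is surfaced as a RuntimeException,
    // which in turn fails the Gradle :sdks:java:testing:load-tests:run task.
    static void handleFailure(JobState state) {
        if (EnumSet.of(JobState.CANCELLED, JobState.FAILED).contains(state)) {
            throw new RuntimeException("Invalid job state: " + state + ".");
        }
    }

    public static void main(String[] args) {
        handleFailure(JobState.DONE); // no exception
        try {
            handleFailure(JobState.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```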

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211202124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f4dd28e5ce64b7d74204df409ea446fae9e268806ccceae69cb903a1a744811a
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211202124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f4dd28e5ce64b7d74204df409ea446fae9e268806ccceae69cb903a1a744811a]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211202124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f4dd28e5ce64b7d74204df409ea446fae9e268806ccceae69cb903a1a744811a])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f4dd28e5ce64b7d74204df409ea446fae9e268806ccceae69cb903a1a744811a
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f4dd28e5ce64b7d74204df409ea446fae9e268806ccceae69cb903a1a744811a
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f4dd28e5ce64b7d74204df409ea446fae9e268806ccceae69cb903a1a744811a].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 45s
101 actionable tasks: 71 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ulfhgwrtuua4o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #167

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/167/display/redirect?page=changes>

Changes:

[mmack] [BEAM-11494][BEAM-11821] FileIO stops overwriting files on retries (AWS

[Pablo Estrada] [py] Supporting ignore_unknown_values for WriteToBigQuery

[noreply] Merge pull request #16072 from [BEAM-13135][Playground] Add function to

[noreply] Merge pull request #16084 from [BEAM-13347][Playground] [Bugfix] Backend

[noreply] Merge pull request #15987 from [BEAM-13242] Allow values with smaller

[noreply] [BEAM-11936] Fix errorprone UnusedVariable in Extensions (#16033)

[noreply] Merge pull request #15936 from [BEAM-13351] FhirIO GetPatientEverything

[avilovpavel6] Add fmt preparator for go sdk

[noreply] Merge pull request #16073 from [BEAM-13267][Playground] Implement

[ningkang0957] Updated screen diff test for Interactive Beam

[noreply] [BEAM-13282][Playground] Create go dockerfile #16049

[noreply] Merge pull request #16085 from [Beam-13336][Playground] Refactor

[noreply] [BEAM-11936] Fix errorprone UnusedVariable in Extensions-sql (#16034)

[noreply] [BEAM-13193] Support

[noreply] Merge pull request #16090: [BEAM-13352] Add transient qualifier to


------------------------------------------
[...truncated 48.65 KB...]
348f9624d5ed: Preparing
f0c0b4242b1f: Preparing
d88332d7efe5: Preparing
b0ccf29f0a00: Preparing
53ed7ada91af: Preparing
abb00800e18b: Preparing
311cde9b71dd: Preparing
d479baaba261: Preparing
a9e4c9343539: Preparing
47ee2d19f81a: Preparing
ab9d251e27cb: Preparing
8a5844586fdb: Preparing
a4aba4e59b40: Preparing
5499f2905579: Preparing
a36ba9e322f7: Preparing
ab9d251e27cb: Waiting
d479baaba261: Waiting
d88332d7efe5: Waiting
8a5844586fdb: Waiting
47ee2d19f81a: Waiting
b0ccf29f0a00: Waiting
a4aba4e59b40: Waiting
abb00800e18b: Waiting
f0c0b4242b1f: Waiting
53ed7ada91af: Waiting
a9e4c9343539: Waiting
311cde9b71dd: Waiting
5499f2905579: Waiting
a36ba9e322f7: Waiting
9d7b54cf0721: Pushed
348f9624d5ed: Pushed
110d54eba4c1: Pushed
f0c0b4242b1f: Pushed
6164f9663732: Pushed
2a7497ef9079: Pushed
b0ccf29f0a00: Pushed
53ed7ada91af: Pushed
a9e4c9343539: Layer already exists
311cde9b71dd: Pushed
47ee2d19f81a: Layer already exists
ab9d251e27cb: Layer already exists
8a5844586fdb: Layer already exists
a4aba4e59b40: Layer already exists
5499f2905579: Layer already exists
a36ba9e322f7: Layer already exists
d88332d7efe5: Pushed
d479baaba261: Pushed
abb00800e18b: Pushed
20211201124336: digest: sha256:7842836cfde2dc38a12000047f23497aa28d5ec4c47954f0d2ab50d0c5e5ceb5 size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 01, 2021 12:45:51 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 01, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Dec 01, 2021 12:45:52 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 01, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 01, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Dec 01, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Dec 01, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 01, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111769 bytes, hash a9885a91185c78b8becce7d562022023aedcee54516191af986e1c17ca34cef0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-qYhakRhceLi-zOfVYgIgI67c7lRRYZGvmG4cF8o0zvA.pb
Dec 01, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 01, 2021 12:45:57 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3af7d855, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77049094, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f88bfbe]
Dec 01, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 01, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 01, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 01, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 01, 2021 12:45:57 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f1868c9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17]
Dec 01, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 01, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 01, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 01, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 01, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 01, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 01, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 01, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 01, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 01, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 01, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 01, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 01, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-01_04_45_57-7573785157800842187?project=apache-beam-testing
Dec 01, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-01_04_45_57-7573785157800842187
Dec 01, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-01_04_45_57-7573785157800842187
Dec 01, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-01T12:46:03.096Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-11lj. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 01, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:08.050Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:08.830Z: Expanding SplittableParDo operations into optimizable parts.
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:08.861Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:08.938Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.002Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.037Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.090Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.197Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.220Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.251Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.284Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.317Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.351Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.375Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.400Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.421Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.464Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.489Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.516Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.548Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.573Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.593Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.626Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.651Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.678Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.711Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.750Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.777Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.806Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:09.832Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 01, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:10.161Z: Starting 5 workers in us-central1-a...
Dec 01, 2021 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:42.377Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 01, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:54.797Z: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 01, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:46:54.826Z: Resized worker pool to 3, though goal was 5. This could be a quota issue.
Dec 01, 2021 12:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:47:05.071Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 01, 2021 12:47:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:47:58.150Z: Workers have started successfully.
Dec 01, 2021 12:47:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T12:47:58.178Z: Workers have started successfully.
Dec 01, 2021 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T16:00:41.723Z: Cancel request is committed for workflow job: 2021-12-01_04_45_57-7573785157800842187.
Dec 01, 2021 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T16:00:41.828Z: Cleaning up.
Dec 01, 2021 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T16:00:41.894Z: Stopping worker pool...
Dec 01, 2021 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T16:00:41.938Z: Stopping worker pool...
Dec 01, 2021 4:03:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T16:03:05.110Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Dec 01, 2021 4:03:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-01T16:03:05.143Z: Worker pool stopped.
Dec 01, 2021 4:03:10 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-01_04_45_57-7573785157800842187 finished with status CANCELLED.
Load test results for test (ID): 9214887a-ec4e-4e4c-ac1e-69a0c7f71c17 and timestamp: 2021-12-01T12:45:52.514000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11539.196
dataflow_v2_java11_total_bytes_count             1.48836149E10
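The two metrics above imply the pipeline's average sustained throughput. A quick sanity check (values copied from this log; the class name is hypothetical and the arithmetic is mine, not produced by the load test itself):

```java
// Self-contained throughput check from the metrics reported above.
// runtimeSec and totalBytes are copied verbatim from the log output.
public class ThroughputCheck {
    public static void main(String[] args) {
        double runtimeSec = 11539.196;       // dataflow_v2_java11_runtime_sec
        double totalBytes = 1.48836149e10;   // dataflow_v2_java11_total_bytes_count
        double bytesPerSec = totalBytes / runtimeSec;
        // Roughly 1.29 MB/s sustained over the ~3.2 h run before cancellation.
        System.out.printf("avg throughput: %.2f MB/s%n", bytesPerSec / 1e6);
    }
}
```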
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
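The RuntimeException above is the load-test harness rejecting any terminal job state other than DONE: these streaming load tests run against a timeout, the job is cancelled when it expires, and the resulting CANCELLED state is then reported as a failure. A self-contained sketch of that check (hypothetical class and enum; the actual logic lives in org.apache.beam.sdk.loadtests.JobFailure and is not reproduced here):

```java
// Hypothetical sketch of the terminal-state check that produces the
// "Invalid job state: CANCELLED." message seen in this log.
public class JobStateCheck {
    // Mirrors the terminal states a Dataflow job can end in.
    enum State { DONE, FAILED, CANCELLED, UPDATED }

    static void handleTerminalState(State state) {
        // Anything other than a clean DONE is treated as a test failure.
        if (state != State.DONE) {
            throw new RuntimeException("Invalid job state: " + state + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleTerminalState(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```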

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211201124336
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7842836cfde2dc38a12000047f23497aa28d5ec4c47954f0d2ab50d0c5e5ceb5
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211201124336]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7842836cfde2dc38a12000047f23497aa28d5ec4c47954f0d2ab50d0c5e5ceb5]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211201124336] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7842836cfde2dc38a12000047f23497aa28d5ec4c47954f0d2ab50d0c5e5ceb5])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7842836cfde2dc38a12000047f23497aa28d5ec4c47954f0d2ab50d0c5e5ceb5
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7842836cfde2dc38a12000047f23497aa28d5ec4c47954f0d2ab50d0c5e5ceb5
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7842836cfde2dc38a12000047f23497aa28d5ec4c47954f0d2ab50d0c5e5ceb5].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 57s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/mlunobhdvfsf6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #166

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/166/display/redirect?page=changes>

Changes:

[kileysok] Increase timeout for Dataflow Streaming VR

[Kyle Weaver] [BEAM-13220] Update release instructions.

[noreply] Merge pull request #16050 from [BEAM-13307][Playground] Support Python

[noreply] Merge pull request #16071 from [BEAM-13306][Playground] Add using of

[noreply] [BEAM-11943] Fix wrong Klio logo being displayed (#16078)

[noreply] Merge pull request #15954 from [BEAM-960][BEAM-1675] Improvements to

[noreply] [BEAM-13284] Respect expiration time for all Redis write methods.

[noreply] [BEAM-12697] Add SBE module and initial classes (#15733)

[noreply] [Playground][Beam-13316][Bugfix] Fix Playground Frontend Precommit

[Kyle Weaver] [BEAM-13337] Periodically delete stale spanner databases.

[noreply] [BEAM-13115][Playground] Security – Mock Storage (#16047)

[noreply] [BEAM-13335] Use shorter, numerical indices for dataframe reads.

[noreply] Be more conservative about rebatching. (#16058)

[noreply] [BEAM-12587] Allow None in Python's Any logical type. (#16055)

[melissapa] [BEAM-11758] Update basics page: Trigger, State and timers


------------------------------------------
[...truncated 48.49 KB...]
22cc9d8dc485: Preparing
fd30fb18479b: Preparing
119eb50fa5ed: Preparing
357535433b5f: Preparing
c3b07ef09fc4: Preparing
de1267421cb2: Preparing
49a87df89925: Preparing
b949590960bd: Preparing
452981d78c45: Preparing
01d2ecf6213f: Preparing
d0f0e5accd86: Preparing
37b27ba05de4: Preparing
a9e4c9343539: Preparing
47ee2d19f81a: Preparing
ab9d251e27cb: Preparing
8a5844586fdb: Preparing
a4aba4e59b40: Preparing
5499f2905579: Preparing
a36ba9e322f7: Preparing
8a5844586fdb: Waiting
49a87df89925: Waiting
d0f0e5accd86: Waiting
a4aba4e59b40: Waiting
b949590960bd: Waiting
5499f2905579: Waiting
37b27ba05de4: Waiting
452981d78c45: Waiting
a36ba9e322f7: Waiting
a9e4c9343539: Waiting
47ee2d19f81a: Waiting
ab9d251e27cb: Waiting
01d2ecf6213f: Waiting
de1267421cb2: Waiting
119eb50fa5ed: Pushed
fd30fb18479b: Pushed
c3b07ef09fc4: Pushed
de1267421cb2: Pushed
22cc9d8dc485: Pushed
b949590960bd: Pushed
357535433b5f: Pushed
452981d78c45: Pushed
49a87df89925: Pushed
a9e4c9343539: Layer already exists
47ee2d19f81a: Layer already exists
ab9d251e27cb: Layer already exists
8a5844586fdb: Layer already exists
a4aba4e59b40: Layer already exists
5499f2905579: Layer already exists
a36ba9e322f7: Layer already exists
37b27ba05de4: Pushed
d0f0e5accd86: Pushed
01d2ecf6213f: Pushed
20211130124331: digest: sha256:6bff8f3c3ac22df4dbc41278d493852c44ac2c05039e9161a851b0ed33504cb7 size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 30, 2021 12:45:24 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 30, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 30, 2021 12:45:25 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 30, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 30, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 30, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 30, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 30, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111716 bytes, hash a425a33e3e0a65638cfc02c2a061d7e62c3e97e07fb53fb26a02d975c2fc60bc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-pCWjPj4KZWOM_ALCoGHX5iw-l-B_tT-yagLZdcL8YLw.pb
Nov 30, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 30, 2021 12:45:30 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@177515d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@52ff99cd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c2af006, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6]
Nov 30, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 30, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 30, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 30, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 30, 2021 12:45:30 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3206174f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@42fb8c87, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15eb0ae9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65e0b505, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67de7a99]
Nov 30, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 30, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 30, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 30, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 30, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 30, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 30, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 30, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 30, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 30, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 30, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 30, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Nov 30, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-30_04_45_30-14949378249761327338?project=apache-beam-testing
Nov 30, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-30_04_45_30-14949378249761327338
Nov 30, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-30_04_45_30-14949378249761327338
Nov 30, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-30T12:45:37.313Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-snhw. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:41.916Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:42.651Z: Expanding SplittableParDo operations into optimizable parts.
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:42.679Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:42.740Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:42.818Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:42.844Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:42.912Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.017Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.070Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.103Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.136Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.158Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.188Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.216Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.253Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.292Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.317Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.339Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.370Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.398Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.427Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.456Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.487Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 30, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.514Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 30, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.540Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 30, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.599Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 30, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.631Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 30, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.661Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 30, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.696Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 30, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:43.726Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 30, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:45:44.110Z: Starting 5 workers in us-central1-a...
Nov 30, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:46:06.068Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 30, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:46:35.750Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 30, 2021 12:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:47:30.008Z: Workers have started successfully.
Nov 30, 2021 12:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T12:47:30.041Z: Workers have started successfully.
Nov 30, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T16:00:33.918Z: Cancel request is committed for workflow job: 2021-11-30_04_45_30-14949378249761327338.
Nov 30, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T16:00:33.983Z: Cleaning up.
Nov 30, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T16:00:34.053Z: Stopping worker pool...
Nov 30, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T16:00:34.105Z: Stopping worker pool...
Nov 30, 2021 4:02:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T16:02:48.202Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 30, 2021 4:02:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-30T16:02:48.242Z: Worker pool stopped.
Nov 30, 2021 4:02:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-30_04_45_30-14949378249761327338 finished with status CANCELLED.
Load test results for test (ID): b5cfb8fa-28a9-4705-89d1-b4863702b70c and timestamp: 2021-11-30T12:45:25.569000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11548.839
dataflow_v2_java11_total_bytes_count             1.80352863E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211130124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6bff8f3c3ac22df4dbc41278d493852c44ac2c05039e9161a851b0ed33504cb7
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211130124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6bff8f3c3ac22df4dbc41278d493852c44ac2c05039e9161a851b0ed33504cb7]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211130124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6bff8f3c3ac22df4dbc41278d493852c44ac2c05039e9161a851b0ed33504cb7])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6bff8f3c3ac22df4dbc41278d493852c44ac2c05039e9161a851b0ed33504cb7
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6bff8f3c3ac22df4dbc41278d493852c44ac2c05039e9161a851b0ed33504cb7
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6bff8f3c3ac22df4dbc41278d493852c44ac2c05039e9161a851b0ed33504cb7].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 39s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ajlrw3jjo5cge

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #165

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/165/display/redirect>

Changes:


------------------------------------------
[...truncated 48.47 KB...]
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
042af599cd41: Preparing
8f5380278bf5: Preparing
09ee2abfddec: Preparing
bfcba587c762: Preparing
db24a6eb056c: Preparing
0b018574049b: Preparing
c64bdd999388: Preparing
4357dbc49323: Preparing
ff3ded16ccfd: Preparing
42dc884742a5: Preparing
3318df26b12a: Preparing
096944600e60: Preparing
a9e4c9343539: Preparing
47ee2d19f81a: Preparing
ab9d251e27cb: Preparing
8a5844586fdb: Preparing
a4aba4e59b40: Preparing
5499f2905579: Preparing
a36ba9e322f7: Preparing
c64bdd999388: Waiting
ff3ded16ccfd: Waiting
8a5844586fdb: Waiting
5499f2905579: Waiting
0b018574049b: Waiting
a9e4c9343539: Waiting
47ee2d19f81a: Waiting
4357dbc49323: Waiting
ab9d251e27cb: Waiting
3318df26b12a: Waiting
096944600e60: Waiting
42dc884742a5: Waiting
a4aba4e59b40: Waiting
09ee2abfddec: Pushed
db24a6eb056c: Pushed
8f5380278bf5: Pushed
042af599cd41: Pushed
0b018574049b: Pushed
bfcba587c762: Pushed
4357dbc49323: Pushed
3318df26b12a: Pushed
ff3ded16ccfd: Pushed
47ee2d19f81a: Layer already exists
c64bdd999388: Pushed
ab9d251e27cb: Layer already exists
a9e4c9343539: Layer already exists
8a5844586fdb: Layer already exists
a4aba4e59b40: Layer already exists
5499f2905579: Layer already exists
a36ba9e322f7: Layer already exists
096944600e60: Pushed
42dc884742a5: Pushed
20211129124331: digest: sha256:5afc79ff4826b7c35eb584305732884be6469ea09f5d18ebc32d647783d8cb11 size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 29, 2021 12:45:33 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 29, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 29, 2021 12:45:34 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 29, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 29, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 29, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 29, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 29, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111716 bytes, hash a6ebe71c86a4d0b91fba17c93acd7e4c1394caac55c55086f20eddc03eac1852> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-puvnHIak0LkfuhfJOs1-TBOUyqxVxVCG8g7dwD6sGFI.pb
Nov 29, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 29, 2021 12:45:38 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c2af006, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3af7d855, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77049094]
Nov 29, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 29, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 29, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 29, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 29, 2021 12:45:38 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48f4713c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f1868c9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968]
Nov 29, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 29, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 29, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 29, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 29, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 29, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 29, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 29, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 29, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 29, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 29, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 29, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Nov 29, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-29_04_45_38-16428249400959612030?project=apache-beam-testing
Nov 29, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-29_04_45_38-16428249400959612030
Nov 29, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-29_04_45_38-16428249400959612030
Nov 29, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-29T12:45:46.529Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-vlu2. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 29, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:52.248Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 29, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.144Z: Expanding SplittableParDo operations into optimizable parts.
Nov 29, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.179Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.250Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.400Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.447Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.516Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.606Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.661Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.696Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.725Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.754Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.781Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.808Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.836Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.865Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.891Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.925Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.959Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:53.992Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:54.048Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:54.076Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:54.110Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:54.145Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:54.177Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:54.209Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:54.243Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:54.275Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:54.302Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:54.330Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:45:54.794Z: Starting 5 workers in us-central1-a...
Nov 29, 2021 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:46:18.736Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 29, 2021 12:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:46:39.962Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 29, 2021 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:47:34.972Z: Workers have started successfully.
Nov 29, 2021 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T12:47:35.006Z: Workers have started successfully.
Nov 29, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T16:00:32.287Z: Cancel request is committed for workflow job: 2021-11-29_04_45_38-16428249400959612030.
Nov 29, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T16:00:32.430Z: Cleaning up.
Nov 29, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T16:00:32.526Z: Stopping worker pool...
Nov 29, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T16:00:32.572Z: Stopping worker pool...
Nov 29, 2021 4:02:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T16:02:49.254Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 29, 2021 4:02:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-29T16:02:49.287Z: Worker pool stopped.
Nov 29, 2021 4:02:56 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-29_04_45_38-16428249400959612030 finished with status CANCELLED.
Load test results for test (ID): d92f23c9-7209-4584-9499-fd264ef7418a and timestamp: 2021-11-29T12:45:34.180000000Z:
                              Metric:            Value:
       dataflow_v2_java11_runtime_sec         11544.278
 dataflow_v2_java11_total_bytes_count     2.52293675E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211129124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5afc79ff4826b7c35eb584305732884be6469ea09f5d18ebc32d647783d8cb11
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211129124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5afc79ff4826b7c35eb584305732884be6469ea09f5d18ebc32d647783d8cb11]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211129124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5afc79ff4826b7c35eb584305732884be6469ea09f5d18ebc32d647783d8cb11])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5afc79ff4826b7c35eb584305732884be6469ea09f5d18ebc32d647783d8cb11
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5afc79ff4826b7c35eb584305732884be6469ea09f5d18ebc32d647783d8cb11
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5afc79ff4826b7c35eb584305732884be6469ea09f5d18ebc32d647783d8cb11].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 44s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/u4qqon5swkz4u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #164

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/164/display/redirect?page=changes>

Changes:

[vitaly.ivanov] [BEAM-13269] Data exceeds database column capacity error while inserting

[vitaly.ivanov] [BEAM-13269] Data exceeds database column capacity error while inserting

[noreply] Merge pull request #16074 from [BEAM-13323][Playground] [Bugfix] Fix CI


------------------------------------------
[...truncated 48.49 KB...]
1780a986a391: Preparing
260779d49dfb: Preparing
d91179b94dcc: Preparing
255e6d95ccc8: Preparing
7d006af8d29d: Preparing
25cd40d8dbc1: Preparing
73d0dba98448: Preparing
603a9b8b9b10: Preparing
66cd4191185b: Preparing
05207a53e914: Preparing
6cadd39f5402: Preparing
4e3f4c4e2855: Preparing
a9e4c9343539: Preparing
47ee2d19f81a: Preparing
ab9d251e27cb: Preparing
8a5844586fdb: Preparing
a4aba4e59b40: Preparing
5499f2905579: Preparing
a36ba9e322f7: Preparing
603a9b8b9b10: Waiting
25cd40d8dbc1: Waiting
73d0dba98448: Waiting
66cd4191185b: Waiting
ab9d251e27cb: Waiting
5499f2905579: Waiting
a36ba9e322f7: Waiting
05207a53e914: Waiting
8a5844586fdb: Waiting
a4aba4e59b40: Waiting
a9e4c9343539: Waiting
4e3f4c4e2855: Waiting
6cadd39f5402: Waiting
47ee2d19f81a: Waiting
7d006af8d29d: Pushed
d91179b94dcc: Pushed
1780a986a391: Pushed
25cd40d8dbc1: Pushed
255e6d95ccc8: Pushed
66cd4191185b: Pushed
73d0dba98448: Pushed
603a9b8b9b10: Pushed
a9e4c9343539: Layer already exists
47ee2d19f81a: Layer already exists
ab9d251e27cb: Layer already exists
8a5844586fdb: Layer already exists
a4aba4e59b40: Layer already exists
5499f2905579: Layer already exists
a36ba9e322f7: Layer already exists
6cadd39f5402: Pushed
4e3f4c4e2855: Pushed
05207a53e914: Pushed
260779d49dfb: Pushed
20211128124334: digest: sha256:61273d3b426f6d6c7f2f0eaf0310d35b1a45d480d3805d3e0e155670a45e8969 size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 28, 2021 12:45:50 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 28, 2021 12:45:51 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 28, 2021 12:45:51 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 28, 2021 12:45:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 28, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 28, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 28, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111716 bytes, hash 5c6cb0c19fbfbb769aa40edcacb5d42a701f556d8a2b61a76d71c56beab9479f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-XGywwZ-_u3aapA7crLXUKnAfVW2KK2GnbXHFa-q5R58.pb
Nov 28, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 28, 2021 12:45:56 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c2af006, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3af7d855, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77049094]
Nov 28, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 28, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 28, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 28, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 28, 2021 12:45:56 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48f4713c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f1868c9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968]
Nov 28, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 28, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 28, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 28, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 28, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 28, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 28, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 28, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 28, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 28, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 28, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 28, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Nov 28, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-28_04_45_56-13413062277742559395?project=apache-beam-testing
Nov 28, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-28_04_45_56-13413062277742559395
Nov 28, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-28_04_45_56-13413062277742559395
Nov 28, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-28T12:46:03.731Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-vbej. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 28, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:08.503Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 28, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.007Z: Expanding SplittableParDo operations into optimizable parts.
Nov 28, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.027Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 28, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.080Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.139Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.195Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.260Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.357Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.390Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.435Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.467Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.502Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.533Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.566Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.598Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.620Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.653Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.686Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.718Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.760Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.806Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.839Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.864Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.884Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.915Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.949Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:09.982Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:10.013Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:10.057Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:10.089Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 28, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:10.438Z: Starting 5 workers in us-central1-a...
Nov 28, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:38.523Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 28, 2021 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:46:50.430Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 28, 2021 12:47:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:47:44.482Z: Workers have started successfully.
Nov 28, 2021 12:47:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T12:47:44.516Z: Workers have started successfully.
Nov 28, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T16:00:31.528Z: Cancel request is committed for workflow job: 2021-11-28_04_45_56-13413062277742559395.
Nov 28, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T16:00:31.945Z: Cleaning up.
Nov 28, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T16:00:32.037Z: Stopping worker pool...
Nov 28, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T16:00:32.220Z: Stopping worker pool...
Nov 28, 2021 4:02:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T16:02:47.442Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 28, 2021 4:02:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-28T16:02:47.479Z: Worker pool stopped.
Nov 28, 2021 4:02:54 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-28_04_45_56-13413062277742559395 finished with status CANCELLED.
Load test results for test (ID): 3a652a54-0a2b-46da-8ad7-9d83a78e2d24 and timestamp: 2021-11-28T12:45:51.496000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11501.868
dataflow_v2_java11_total_bytes_count             2.36728306E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211128124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:61273d3b426f6d6c7f2f0eaf0310d35b1a45d480d3805d3e0e155670a45e8969
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211128124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:61273d3b426f6d6c7f2f0eaf0310d35b1a45d480d3805d3e0e155670a45e8969]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211128124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:61273d3b426f6d6c7f2f0eaf0310d35b1a45d480d3805d3e0e155670a45e8969])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:61273d3b426f6d6c7f2f0eaf0310d35b1a45d480d3805d3e0e155670a45e8969
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:61273d3b426f6d6c7f2f0eaf0310d35b1a45d480d3805d3e0e155670a45e8969
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:61273d3b426f6d6c7f2f0eaf0310d35b1a45d480d3805d3e0e155670a45e8969].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 37s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wt2txti7cn7xi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


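Editor's note: the failure above hinges on the terminal-state line "Job ... finished with status CANCELLED.", which the load test's JobFailure check turns into the "Invalid job state" RuntimeException. As a triage aid, the sketch below shows one way to pull that terminal state out of a log excerpt like this one. The `terminal_state` helper is hypothetical (not part of Beam or Jenkins); it only assumes the line format emitted by DataflowPipelineJob.logTerminalState as seen in this log.

```python
import re

# Matches the terminal-state line emitted by
# org.apache.beam.runners.dataflow.DataflowPipelineJob.logTerminalState,
# e.g. "INFO: Job 2021-11-28_04_45_56-... finished with status CANCELLED."
TERMINAL = re.compile(r"Job (\S+) finished with status (\w+)\.")


def terminal_state(log_text):
    """Return (job_id, state) for the last terminal-state line, or None."""
    matches = TERMINAL.findall(log_text)
    return matches[-1] if matches else None


sample = ("INFO: Job 2021-11-28_04_45_56-13413062277742559395 "
          "finished with status CANCELLED.")
result = terminal_state(sample)
```

Running this over the archive above would report the job as CANCELLED, which here reflects the Jenkins 4-hour cancel rather than a pipeline-level crash.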
Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #163

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/163/display/redirect>

Changes:


------------------------------------------
[...truncated 48.08 KB...]
1c6a92bd04bd: Preparing
2e84b2cd629a: Preparing
3233a81acadc: Preparing
5461a9724d64: Preparing
bdcb92e8950b: Preparing
e12deef2713a: Preparing
a9e4c9343539: Preparing
47ee2d19f81a: Preparing
ab9d251e27cb: Preparing
2e84b2cd629a: Waiting
5461a9724d64: Waiting
8a5844586fdb: Preparing
3233a81acadc: Waiting
a4aba4e59b40: Preparing
5499f2905579: Preparing
bdcb92e8950b: Waiting
a9e4c9343539: Waiting
47ee2d19f81a: Waiting
a36ba9e322f7: Preparing
8a5844586fdb: Waiting
a4aba4e59b40: Waiting
5499f2905579: Waiting
a5d8ec5405f9: Waiting
a36ba9e322f7: Waiting
ab9d251e27cb: Waiting
8da5421b804b: Pushed
909ecb9172e0: Pushed
2be90649daa8: Pushed
a5d8ec5405f9: Pushed
dd91861233dd: Pushed
98901f8af3ee: Pushed
2e84b2cd629a: Pushed
3233a81acadc: Pushed
a9e4c9343539: Layer already exists
47ee2d19f81a: Layer already exists
ab9d251e27cb: Layer already exists
8a5844586fdb: Layer already exists
a4aba4e59b40: Layer already exists
bdcb92e8950b: Pushed
5499f2905579: Layer already exists
a36ba9e322f7: Layer already exists
e12deef2713a: Pushed
1c6a92bd04bd: Pushed
5461a9724d64: Pushed
20211127124332: digest: sha256:c44ba3d6e18b4542f8294f3b50b46035b75f98347237d9d9b8b821045e9ca16d size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 27, 2021 12:45:17 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 27, 2021 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 27, 2021 12:45:18 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 27, 2021 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 27, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 27, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 27, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 27, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111716 bytes, hash 43eb46937e01796d53e29ba9dd6cc2a2e764819e4b77a5fe014d88429ba4b374> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Q-tGk34BeW1T4pup3WzCoudkgZ5Ld6X-AU2IQpuks3Q.pb
Nov 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 27, 2021 12:45:22 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@177515d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@52ff99cd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c2af006, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6]
Nov 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 27, 2021 12:45:22 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3206174f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@42fb8c87, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15eb0ae9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65e0b505, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67de7a99]
Nov 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Nov 27, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-27_04_45_23-6887480424228758737?project=apache-beam-testing
Nov 27, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-27_04_45_23-6887480424228758737
Nov 27, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-27_04_45_23-6887480424228758737
Nov 27, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-27T12:45:29.329Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-tsvn. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:34.097Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:34.661Z: Expanding SplittableParDo operations into optimizable parts.
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:34.702Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:34.774Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:34.842Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:34.873Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:34.950Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.061Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.093Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.119Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.151Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.189Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.222Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.257Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.290Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.336Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.374Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.403Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.437Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 27, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.471Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 27, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.506Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 27, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.554Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 27, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.580Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 27, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.612Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 27, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.643Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 27, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.682Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 27, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.717Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 27, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.755Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 27, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.784Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 27, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:35.818Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 27, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:45:36.243Z: Starting 5 workers in us-central1-a...
Nov 27, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:46:02.233Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 27, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:46:19.385Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 27, 2021 12:47:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:47:17.737Z: Workers have started successfully.
Nov 27, 2021 12:47:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T12:47:17.769Z: Workers have started successfully.
Nov 27, 2021 2:12:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-27T14:12:36.310Z: Staged package amazon-kinesis-client-1.14.2-qtBh3OuKuLE7S26ITTJzYsoYfG-mWjqbGmv3VJPhec8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/amazon-kinesis-client-1.14.2-qtBh3OuKuLE7S26ITTJzYsoYfG-mWjqbGmv3VJPhec8.jar' is inaccessible.
Nov 27, 2021 2:12:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-27T14:12:42.146Z: Staged package jakarta.activation-api-1.2.2-oYepORA671hJp6-EvX4nvi0SDEEK8pFDc3X_4GH08J0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jakarta.activation-api-1.2.2-oYepORA671hJp6-EvX4nvi0SDEEK8pFDc3X_4GH08J0.jar' is inaccessible.
Nov 27, 2021 2:12:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-27T14:12:42.188Z: Staged package jakarta.xml.bind-api-2.3.3-wEU59HLppt0MdoXqgtZ3KCJpq457rKLhRQDjgeDGzsU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jakarta.xml.bind-api-2.3.3-wEU59HLppt0MdoXqgtZ3KCJpq457rKLhRQDjgeDGzsU.jar' is inaccessible.
Nov 27, 2021 2:12:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-27T14:12:43.367Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 27, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T16:00:34.493Z: Cancel request is committed for workflow job: 2021-11-27_04_45_23-6887480424228758737.
Nov 27, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T16:00:34.521Z: Cleaning up.
Nov 27, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T16:00:34.590Z: Stopping worker pool...
Nov 27, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T16:00:34.656Z: Stopping worker pool...
Nov 27, 2021 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T16:02:55.989Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 27, 2021 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-27T16:02:56.023Z: Worker pool stopped.
Nov 27, 2021 4:03:01 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-27_04_45_23-6887480424228758737 finished with status CANCELLED.
Load test results for test (ID): 55206b0c-aefb-4f05-a172-41d13136a7e2 and timestamp: 2021-11-27T12:45:18.190000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11571.355
dataflow_v2_java11_total_bytes_count             2.07269107E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211127124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c44ba3d6e18b4542f8294f3b50b46035b75f98347237d9d9b8b821045e9ca16d
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211127124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c44ba3d6e18b4542f8294f3b50b46035b75f98347237d9d9b8b821045e9ca16d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211127124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c44ba3d6e18b4542f8294f3b50b46035b75f98347237d9d9b8b821045e9ca16d])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c44ba3d6e18b4542f8294f3b50b46035b75f98347237d9d9b8b821045e9ca16d
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c44ba3d6e18b4542f8294f3b50b46035b75f98347237d9d9b8b821045e9ca16d
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c44ba3d6e18b4542f8294f3b50b46035b75f98347237d9d9b8b821045e9ca16d].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 45s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ixk722wtnlutc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #162

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/162/display/redirect>

Changes:


------------------------------------------
[...truncated 48.35 KB...]

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
ab28e4d9ed50: Preparing
e841475cba0f: Preparing
fbf5c5c2b0e8: Preparing
c7567abe3a99: Preparing
5eb68597851d: Preparing
441d45f2bf20: Preparing
4ddfb9f670e8: Preparing
7534a823251f: Preparing
fb28fc033933: Preparing
65e7c91c4fed: Preparing
11c4b41de4a2: Preparing
9a148d5c3fba: Preparing
a9e4c9343539: Preparing
47ee2d19f81a: Preparing
441d45f2bf20: Waiting
ab9d251e27cb: Preparing
8a5844586fdb: Preparing
4ddfb9f670e8: Waiting
a4aba4e59b40: Preparing
5499f2905579: Preparing
7534a823251f: Waiting
fb28fc033933: Waiting
a36ba9e322f7: Preparing
9a148d5c3fba: Waiting
a4aba4e59b40: Waiting
11c4b41de4a2: Waiting
ab9d251e27cb: Waiting
8a5844586fdb: Waiting
5499f2905579: Waiting
a36ba9e322f7: Waiting
65e7c91c4fed: Waiting
5eb68597851d: Pushed
e841475cba0f: Pushed
fbf5c5c2b0e8: Pushed
ab28e4d9ed50: Pushed
441d45f2bf20: Pushed
c7567abe3a99: Pushed
7534a823251f: Pushed
fb28fc033933: Pushed
a9e4c9343539: Layer already exists
47ee2d19f81a: Layer already exists
ab9d251e27cb: Layer already exists
11c4b41de4a2: Pushed
8a5844586fdb: Layer already exists
a4aba4e59b40: Layer already exists
4ddfb9f670e8: Pushed
5499f2905579: Layer already exists
a36ba9e322f7: Layer already exists
9a148d5c3fba: Pushed
65e7c91c4fed: Pushed
20211126124333: digest: sha256:2d7c6b9211345efc3330d846b7254c9cb037a2915090c98db71c8194739c82bc size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 26, 2021 12:45:32 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 26, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 26, 2021 12:45:33 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 26, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 26, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 26, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 26, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111716 bytes, hash c956899fc4a0767482982fe4bb216f478b48aefc2814e53d4e941357e1c2fd4c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-yVaJn8SgdnSCmC_kuyFvR4tIrvwoFOU9TpQTV-HC_Uw.pb
Nov 26, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 26, 2021 12:45:38 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c2af006, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3af7d855, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77049094]
Nov 26, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 26, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 26, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 26, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 26, 2021 12:45:38 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48f4713c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f1868c9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968]
Nov 26, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 26, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 26, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 26, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 26, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 26, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 26, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 26, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 26, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 26, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 26, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 26, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Nov 26, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-26_04_45_38-3752755376803788383?project=apache-beam-testing
Nov 26, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-26_04_45_38-3752755376803788383
Nov 26, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-26_04_45_38-3752755376803788383
Nov 26, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-26T12:45:45.719Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-r6oa. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:48.773Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:49.431Z: Expanding SplittableParDo operations into optimizable parts.
Nov 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:49.463Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:49.532Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:49.594Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:49.616Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:49.677Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:49.774Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:49.796Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:49.823Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:49.879Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:49.901Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:49.931Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:49.969Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:49.999Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 26, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:50.041Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 26, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:50.064Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 26, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:50.095Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 26, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:50.137Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 26, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:50.168Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 26, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:50.199Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 26, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:50.224Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 26, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:50.259Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 26, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:50.294Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 26, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:50.320Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 26, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:50.356Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 26, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:50.387Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 26, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:50.416Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 26, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:50.456Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 26, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:50.479Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 26, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:45:50.840Z: Starting 5 workers in us-central1-a...
Nov 26, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:46:00.606Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 26, 2021 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:46:33.804Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 26, 2021 12:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:47:29.439Z: Workers have started successfully.
Nov 26, 2021 12:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T12:47:29.472Z: Workers have started successfully.
Nov 26, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T16:00:34.322Z: Cancel request is committed for workflow job: 2021-11-26_04_45_38-3752755376803788383.
Nov 26, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T16:00:34.387Z: Cleaning up.
Nov 26, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T16:00:34.458Z: Stopping worker pool...
Nov 26, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T16:00:34.519Z: Stopping worker pool...
Nov 26, 2021 4:03:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T16:03:01.664Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 26, 2021 4:03:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-26T16:03:01.698Z: Worker pool stopped.
Nov 26, 2021 4:03:08 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-26_04_45_38-3752755376803788383 finished with status CANCELLED.
Load test results for test (ID): 7f01b3c9-ec6b-4d6e-94fe-c54db1e830d3 and timestamp: 2021-11-26T12:45:32.959000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                  11548.83
dataflow_v2_java11_total_bytes_count              1.8461165E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211126124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2d7c6b9211345efc3330d846b7254c9cb037a2915090c98db71c8194739c82bc
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211126124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2d7c6b9211345efc3330d846b7254c9cb037a2915090c98db71c8194739c82bc]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211126124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2d7c6b9211345efc3330d846b7254c9cb037a2915090c98db71c8194739c82bc])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2d7c6b9211345efc3330d846b7254c9cb037a2915090c98db71c8194739c82bc
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2d7c6b9211345efc3330d846b7254c9cb037a2915090c98db71c8194739c82bc
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2d7c6b9211345efc3330d846b7254c9cb037a2915090c98db71c8194739c82bc].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 53s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/hdvg77gib4llm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #161

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/161/display/redirect?page=changes>

Changes:

[Luke Cwik] [BEAM-13015] Simplify

[Luke Cwik] [BEAM-13015] Simplify DataStreamsDecoder by moving hasNext logic to next


------------------------------------------
[...truncated 52.04 KB...]
INFO: Uploading <111716 bytes, hash 4f49a4178dc3c51fd74d9171ced484a1e1bd6838f435c258d043b0bd84ff08b5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-T0mkF43DxR_XTZFxztSEoeG9aDj0NcJY0EOwvYT_CLU.pb
Nov 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 25, 2021 12:45:36 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3af7d855, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77049094, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f88bfbe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59bbe88a]
Nov 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 25, 2021 12:45:36 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3206174f]
Nov 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Nov 25, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-25_04_45_36-14387302763703776575?project=apache-beam-testing
Nov 25, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-25_04_45_36-14387302763703776575
Nov 25, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-25_04_45_36-14387302763703776575
Nov 25, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-25T12:45:44.000Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-xij8. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 25, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:48.540Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 25, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:49.222Z: Expanding SplittableParDo operations into optimizable parts.
Nov 25, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:49.274Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 25, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:49.371Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:49.458Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:49.496Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:49.563Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:49.668Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:49.700Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:49.734Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:49.759Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:49.793Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:49.828Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:49.859Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:49.896Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:49.921Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:49.948Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:49.974Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:50.008Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:50.037Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:50.062Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:50.097Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:50.121Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:50.149Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:50.238Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:50.379Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:50.444Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:50.542Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:50.590Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 25, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:50.636Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 25, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:45:51.053Z: Starting 5 workers in us-central1-a...
Nov 25, 2021 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:46:15.449Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 25, 2021 12:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:46:35.424Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 25, 2021 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:47:32.822Z: Workers have started successfully.
Nov 25, 2021 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T12:47:32.862Z: Workers have started successfully.
Nov 25, 2021 1:24:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:24:52.094Z: Staged package commons-compress-1.21-auz9VFlyillWAc-gcljRMZcv_Dm0kutIvdWWV3ovJEo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-compress-1.21-auz9VFlyillWAc-gcljRMZcv_Dm0kutIvdWWV3ovJEo.jar' is inaccessible.
Nov 25, 2021 1:24:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:24:52.309Z: Staged package conscrypt-openjdk-uber-2.5.1-AfnHQstZKhUeLmK9U5eomAYoqWcAH82s1KpHRGeGhfM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/conscrypt-openjdk-uber-2.5.1-AfnHQstZKhUeLmK9U5eomAYoqWcAH82s1KpHRGeGhfM.jar' is inaccessible.
Nov 25, 2021 1:24:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:24:54.048Z: Staged package netty-buffer-4.1.52.Final-QcoQNa7m4PgXdZfQp2EKWqLG5vrHRKyB_eTl5Y9jlnU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-buffer-4.1.52.Final-QcoQNa7m4PgXdZfQp2EKWqLG5vrHRKyB_eTl5Y9jlnU.jar' is inaccessible.
Nov 25, 2021 1:24:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:24:54.095Z: Staged package netty-codec-4.1.52.Final-PH543M5-NT2GrH26LCUatc3dTrhuqoic_kIpeCuCjlI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-4.1.52.Final-PH543M5-NT2GrH26LCUatc3dTrhuqoic_kIpeCuCjlI.jar' is inaccessible.
Nov 25, 2021 1:24:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:24:54.137Z: Staged package netty-codec-http-4.1.52.Final-tdoo0OqHiwfTERrPXk-vQSiO8jArJkFBkP3Rp9lzyAw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-http-4.1.52.Final-tdoo0OqHiwfTERrPXk-vQSiO8jArJkFBkP3Rp9lzyAw.jar' is inaccessible.
Nov 25, 2021 1:24:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:24:54.188Z: Staged package netty-codec-http2-4.1.52.Final-61J6QtIonRLe5tR1PKJxWXije7tBxxkVzk6VOcSfk6E.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-http2-4.1.52.Final-61J6QtIonRLe5tR1PKJxWXije7tBxxkVzk6VOcSfk6E.jar' is inaccessible.
Nov 25, 2021 1:24:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:24:54.236Z: Staged package netty-codec-socks-4.1.52.Final-_sxXfgu3ch1naM_BhG7hcpEdFquA372ZldllelUdTqc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-socks-4.1.52.Final-_sxXfgu3ch1naM_BhG7hcpEdFquA372ZldllelUdTqc.jar' is inaccessible.
Nov 25, 2021 1:24:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:24:54.289Z: Staged package netty-common-4.1.52.Final-DR8QHk231TDtBNfKvVfZXAdQ8Fm7az-t_tDIDUc0xxc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-common-4.1.52.Final-DR8QHk231TDtBNfKvVfZXAdQ8Fm7az-t_tDIDUc0xxc.jar' is inaccessible.
Nov 25, 2021 1:24:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:24:54.339Z: Staged package netty-handler-4.1.52.Final-LNwh-_rgSUDIWSkDl3nJP1H2GPXvshYHkxnc_PMqVcE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-handler-4.1.52.Final-LNwh-_rgSUDIWSkDl3nJP1H2GPXvshYHkxnc_PMqVcE.jar' is inaccessible.
Nov 25, 2021 1:24:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:24:54.389Z: Staged package netty-handler-proxy-4.1.52.Final-6NB4Y860vUgG4lWh1TLIZQ8POy1NAGrW0hgS6UG0L9w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-handler-proxy-4.1.52.Final-6NB4Y860vUgG4lWh1TLIZQ8POy1NAGrW0hgS6UG0L9w.jar' is inaccessible.
Nov 25, 2021 1:24:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:24:54.437Z: Staged package netty-resolver-4.1.52.Final-butyojOXnLRbC01tRsWkEzLFfAz4qAOx78YNu-y6CaY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-resolver-4.1.52.Final-butyojOXnLRbC01tRsWkEzLFfAz4qAOx78YNu-y6CaY.jar' is inaccessible.
Nov 25, 2021 1:24:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:24:54.502Z: Staged package netty-transport-4.1.52.Final-mj5vjA5V3jY-seoQ_ngXl-yjlOYhht8q4LTrK84LRUE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-transport-4.1.52.Final-mj5vjA5V3jY-seoQ_ngXl-yjlOYhht8q4LTrK84LRUE.jar' is inaccessible.
Nov 25, 2021 1:24:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:24:54.546Z: Staged package opencensus-api-0.28.0-DBcj8_bTBhMjhFzouIs1_dpQCBLgp1uOtfzErYyHGpU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/opencensus-api-0.28.0-DBcj8_bTBhMjhFzouIs1_dpQCBLgp1uOtfzErYyHGpU.jar' is inaccessible.
Nov 25, 2021 1:24:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:24:54.636Z: Staged package paranamer-2.8-aIyxGKYCHYGROOhVIIyVYDFoi-S0eiS7YVvsxjrO3wc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/paranamer-2.8-aIyxGKYCHYGROOhVIIyVYDFoi-S0eiS7YVvsxjrO3wc.jar' is inaccessible.
Nov 25, 2021 1:24:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:24:54.680Z: Staged package perfmark-api-0.23.0-xwW1wQwY_zAyuegXQrwvaw5WB_am38DIrQz_ddSRMEI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/perfmark-api-0.23.0-xwW1wQwY_zAyuegXQrwvaw5WB_am38DIrQz_ddSRMEI.jar' is inaccessible.
Nov 25, 2021 1:24:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:24:55.041Z: Staged package snappy-java-1.1.8.4-JMTR_B6J4HgzGrj0AamcrWhZm95KLkUWBCy1SMUbHD4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/snappy-java-1.1.8.4-JMTR_B6J4HgzGrj0AamcrWhZm95KLkUWBCy1SMUbHD4.jar' is inaccessible.
Nov 25, 2021 1:24:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-25T13:24:55.199Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 25, 2021 1:27:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-25T13:27:55.781Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 25, 2021 1:30:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:30:52.167Z: Staged package commons-compress-1.21-auz9VFlyillWAc-gcljRMZcv_Dm0kutIvdWWV3ovJEo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-compress-1.21-auz9VFlyillWAc-gcljRMZcv_Dm0kutIvdWWV3ovJEo.jar' is inaccessible.
Nov 25, 2021 1:30:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:30:52.292Z: Staged package conscrypt-openjdk-uber-2.5.1-AfnHQstZKhUeLmK9U5eomAYoqWcAH82s1KpHRGeGhfM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/conscrypt-openjdk-uber-2.5.1-AfnHQstZKhUeLmK9U5eomAYoqWcAH82s1KpHRGeGhfM.jar' is inaccessible.
Nov 25, 2021 1:30:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:30:53.931Z: Staged package netty-buffer-4.1.52.Final-QcoQNa7m4PgXdZfQp2EKWqLG5vrHRKyB_eTl5Y9jlnU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-buffer-4.1.52.Final-QcoQNa7m4PgXdZfQp2EKWqLG5vrHRKyB_eTl5Y9jlnU.jar' is inaccessible.
Nov 25, 2021 1:30:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:30:54.026Z: Staged package netty-codec-4.1.52.Final-PH543M5-NT2GrH26LCUatc3dTrhuqoic_kIpeCuCjlI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-4.1.52.Final-PH543M5-NT2GrH26LCUatc3dTrhuqoic_kIpeCuCjlI.jar' is inaccessible.
Nov 25, 2021 1:30:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:30:54.078Z: Staged package netty-codec-http-4.1.52.Final-tdoo0OqHiwfTERrPXk-vQSiO8jArJkFBkP3Rp9lzyAw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-http-4.1.52.Final-tdoo0OqHiwfTERrPXk-vQSiO8jArJkFBkP3Rp9lzyAw.jar' is inaccessible.
Nov 25, 2021 1:30:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:30:54.132Z: Staged package netty-codec-http2-4.1.52.Final-61J6QtIonRLe5tR1PKJxWXije7tBxxkVzk6VOcSfk6E.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-http2-4.1.52.Final-61J6QtIonRLe5tR1PKJxWXije7tBxxkVzk6VOcSfk6E.jar' is inaccessible.
Nov 25, 2021 1:30:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:30:54.178Z: Staged package netty-codec-socks-4.1.52.Final-_sxXfgu3ch1naM_BhG7hcpEdFquA372ZldllelUdTqc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-socks-4.1.52.Final-_sxXfgu3ch1naM_BhG7hcpEdFquA372ZldllelUdTqc.jar' is inaccessible.
Nov 25, 2021 1:30:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:30:54.240Z: Staged package netty-common-4.1.52.Final-DR8QHk231TDtBNfKvVfZXAdQ8Fm7az-t_tDIDUc0xxc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-common-4.1.52.Final-DR8QHk231TDtBNfKvVfZXAdQ8Fm7az-t_tDIDUc0xxc.jar' is inaccessible.
Nov 25, 2021 1:30:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:30:54.288Z: Staged package netty-handler-4.1.52.Final-LNwh-_rgSUDIWSkDl3nJP1H2GPXvshYHkxnc_PMqVcE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-handler-4.1.52.Final-LNwh-_rgSUDIWSkDl3nJP1H2GPXvshYHkxnc_PMqVcE.jar' is inaccessible.
Nov 25, 2021 1:30:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:30:54.336Z: Staged package netty-handler-proxy-4.1.52.Final-6NB4Y860vUgG4lWh1TLIZQ8POy1NAGrW0hgS6UG0L9w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-handler-proxy-4.1.52.Final-6NB4Y860vUgG4lWh1TLIZQ8POy1NAGrW0hgS6UG0L9w.jar' is inaccessible.
Nov 25, 2021 1:30:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:30:54.376Z: Staged package netty-resolver-4.1.52.Final-butyojOXnLRbC01tRsWkEzLFfAz4qAOx78YNu-y6CaY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-resolver-4.1.52.Final-butyojOXnLRbC01tRsWkEzLFfAz4qAOx78YNu-y6CaY.jar' is inaccessible.
Nov 25, 2021 1:30:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:30:54.439Z: Staged package netty-transport-4.1.52.Final-mj5vjA5V3jY-seoQ_ngXl-yjlOYhht8q4LTrK84LRUE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-transport-4.1.52.Final-mj5vjA5V3jY-seoQ_ngXl-yjlOYhht8q4LTrK84LRUE.jar' is inaccessible.
Nov 25, 2021 1:30:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:30:54.479Z: Staged package opencensus-api-0.28.0-DBcj8_bTBhMjhFzouIs1_dpQCBLgp1uOtfzErYyHGpU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/opencensus-api-0.28.0-DBcj8_bTBhMjhFzouIs1_dpQCBLgp1uOtfzErYyHGpU.jar' is inaccessible.
Nov 25, 2021 1:30:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:30:54.547Z: Staged package paranamer-2.8-aIyxGKYCHYGROOhVIIyVYDFoi-S0eiS7YVvsxjrO3wc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/paranamer-2.8-aIyxGKYCHYGROOhVIIyVYDFoi-S0eiS7YVvsxjrO3wc.jar' is inaccessible.
Nov 25, 2021 1:30:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:30:54.594Z: Staged package perfmark-api-0.23.0-xwW1wQwY_zAyuegXQrwvaw5WB_am38DIrQz_ddSRMEI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/perfmark-api-0.23.0-xwW1wQwY_zAyuegXQrwvaw5WB_am38DIrQz_ddSRMEI.jar' is inaccessible.
Nov 25, 2021 1:30:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-25T13:30:55.010Z: Staged package snappy-java-1.1.8.4-JMTR_B6J4HgzGrj0AamcrWhZm95KLkUWBCy1SMUbHD4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/snappy-java-1.1.8.4-JMTR_B6J4HgzGrj0AamcrWhZm95KLkUWBCy1SMUbHD4.jar' is inaccessible.
Nov 25, 2021 1:30:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-25T13:30:55.152Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
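The SEVERE messages above all follow one fixed pattern. As a side note (this script is not part of the Beam build tooling, and the sample line below is a shortened, hypothetical variant of the real log lines), the affected GCS paths can be pulled out mechanically for a bulk accessibility check, e.g. with `gsutil stat`:

```python
import re

# Matches the SEVERE lines reported above, e.g.:
#   SEVERE: ...: Staged package foo.jar at location 'gs://bucket/path/foo.jar' is inaccessible.
PATTERN = re.compile(
    r"Staged package (\S+) at location '(gs://[^']+)' is inaccessible\.")

def inaccessible_packages(log_lines):
    """Return (jar_name, gcs_path) pairs for every 'is inaccessible' report."""
    return [m.groups() for line in log_lines if (m := PATTERN.search(line))]

# Shortened sample line (hypothetical jar hash) in the same shape as the log above.
sample = [
    "SEVERE: 2021-11-25T13:30:52.167Z: Staged package commons-compress-1.21-abc.jar "
    "at location 'gs://temp-storage-for-perf-tests/loadtests/staging/"
    "commons-compress-1.21-abc.jar' is inaccessible.",
    "INFO: unrelated line",
]
for jar, path in inaccessible_packages(sample):
    print(jar, path)
```

Each extracted `gs://` path could then be probed with `gsutil stat <path>` under the job's service account to confirm whether the failure is a permissions issue or a missing object.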
Nov 25, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T16:00:32.346Z: Cancel request is committed for workflow job: 2021-11-25_04_45_36-14387302763703776575.
Nov 25, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T16:00:32.386Z: Cleaning up.
Nov 25, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T16:00:32.456Z: Stopping worker pool...
Nov 25, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T16:00:32.516Z: Stopping worker pool...
Nov 25, 2021 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T16:02:51.241Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 25, 2021 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-25T16:02:51.280Z: Worker pool stopped.
Nov 25, 2021 4:02:57 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-25_04_45_36-14387302763703776575 finished with status CANCELLED.
Load test results for test (ID): 630cae55-715d-429c-a625-304c78a81027 and timestamp: 2021-11-25T12:45:30.711000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11475.977
dataflow_v2_java11_total_bytes_count             1.94129639E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211125124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b24df267d99449fe3394d08dfd4d79b8991c418e36cbadf82cc63e48619a8582
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211125124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b24df267d99449fe3394d08dfd4d79b8991c418e36cbadf82cc63e48619a8582]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211125124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b24df267d99449fe3394d08dfd4d79b8991c418e36cbadf82cc63e48619a8582])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b24df267d99449fe3394d08dfd4d79b8991c418e36cbadf82cc63e48619a8582
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b24df267d99449fe3394d08dfd4d79b8991c418e36cbadf82cc63e48619a8582
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b24df267d99449fe3394d08dfd4d79b8991c418e36cbadf82cc63e48619a8582].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 43s
101 actionable tasks: 71 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/prwnunraitcqc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #160

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/160/display/redirect?page=changes>

Changes:

[noreply] Increase the timeout for LoadTest Jenkins jobs 4h->12h.

[noreply] [BEAM-13073] UseParallelGC on Java container

[noreply] Merge pull request #16002 from [BEAM-13101] [Playground] load examples

[noreply] Merge pull request #16016 from [BEAM-13089][Playground] Enable

[noreply] Merge pull request #16046 from [BEAM-13295] [Playground] [Bugfix]

[noreply] Merge pull request #16048 from [Beam-13256] [Playground] Implement an

[noreply] Merge pull request #16026 from [BEAM-13208][Playground] Separate stderr

[noreply] Append to state in chunks. (#15983)

[Luke Cwik] [BEAM-13313] Fix WindowingStrategy proto to use snake case field names.

[noreply] [BEAM-12829] Never copy .gogradle/** for Python. (#16059)


------------------------------------------
[...truncated 48.37 KB...]

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
51bea171d5f6: Preparing
838006e56d1c: Preparing
eaf55fdd28d2: Preparing
c70e269b2ef0: Preparing
cb59a9e06435: Preparing
5763c90fc677: Preparing
10a991f910d4: Preparing
536966cdeac6: Preparing
0416872d8098: Preparing
fddc5bcb125e: Preparing
19397597ab6b: Preparing
80463cec54a1: Preparing
a9e4c9343539: Preparing
47ee2d19f81a: Preparing
ab9d251e27cb: Preparing
8a5844586fdb: Preparing
a4aba4e59b40: Preparing
5499f2905579: Preparing
a36ba9e322f7: Preparing
80463cec54a1: Waiting
a9e4c9343539: Waiting
10a991f910d4: Waiting
47ee2d19f81a: Waiting
536966cdeac6: Waiting
ab9d251e27cb: Waiting
19397597ab6b: Waiting
fddc5bcb125e: Waiting
5763c90fc677: Waiting
8a5844586fdb: Waiting
a4aba4e59b40: Waiting
5499f2905579: Waiting
eaf55fdd28d2: Pushed
cb59a9e06435: Pushed
838006e56d1c: Pushed
5763c90fc677: Pushed
51bea171d5f6: Pushed
c70e269b2ef0: Pushed
536966cdeac6: Pushed
0416872d8098: Pushed
a9e4c9343539: Layer already exists
47ee2d19f81a: Layer already exists
10a991f910d4: Pushed
ab9d251e27cb: Layer already exists
8a5844586fdb: Layer already exists
19397597ab6b: Pushed
a4aba4e59b40: Layer already exists
80463cec54a1: Pushed
a36ba9e322f7: Layer already exists
5499f2905579: Layer already exists
fddc5bcb125e: Pushed
20211124124332: digest: sha256:ec7ad05fef57bf2159bc12994cea371557758d553489bcf5a5cb864fe099730d size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 24, 2021 12:50:49 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 24, 2021 12:50:49 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 24, 2021 12:50:50 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 24, 2021 12:50:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 24, 2021 12:50:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 24, 2021 12:50:53 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 24, 2021 12:50:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 24, 2021 12:50:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111716 bytes, hash d87390877bb0bde0a3029ff92ab2bd5adbeeb9738bed6cb0fd51f3e5e0bb7052> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-2HOQh3uwveCjAp_5KrK9WtvuuXOL7Wyw_VHz5eC7cFI.pb
Nov 24, 2021 12:50:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 24, 2021 12:50:54 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@177515d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@52ff99cd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c2af006, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6]
Nov 24, 2021 12:50:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 24, 2021 12:50:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 24, 2021 12:50:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 24, 2021 12:50:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 24, 2021 12:50:54 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3206174f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@42fb8c87, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15eb0ae9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65e0b505, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67de7a99]
Nov 24, 2021 12:50:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 24, 2021 12:50:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 24, 2021 12:50:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 24, 2021 12:50:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 24, 2021 12:50:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 24, 2021 12:50:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 24, 2021 12:50:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 24, 2021 12:50:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 24, 2021 12:50:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 24, 2021 12:50:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 24, 2021 12:50:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 24, 2021 12:50:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Nov 24, 2021 12:50:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-24_04_50_55-1265207853464830809?project=apache-beam-testing
Nov 24, 2021 12:50:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-24_04_50_55-1265207853464830809
Nov 24, 2021 12:50:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-24_04_50_55-1265207853464830809
Nov 24, 2021 12:51:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-24T12:51:02.033Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-jih7. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:06.151Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:06.579Z: Expanding SplittableParDo operations into optimizable parts.
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:06.626Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:06.683Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:06.767Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:06.795Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:06.861Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:06.973Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.006Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.048Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.085Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.114Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.148Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.181Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.217Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.249Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.284Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.315Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.348Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.383Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.405Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.429Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.451Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.477Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.508Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.534Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 24, 2021 12:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.560Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 24, 2021 12:51:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.589Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 24, 2021 12:51:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.613Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 24, 2021 12:51:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.657Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 24, 2021 12:51:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:07.982Z: Starting 5 workers in us-central1-a...
Nov 24, 2021 12:51:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:41.877Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 24, 2021 12:51:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:51:52.623Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 24, 2021 12:52:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:52:46.146Z: Workers have started successfully.
Nov 24, 2021 12:52:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T12:52:46.184Z: Workers have started successfully.
Nov 24, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T16:00:24.257Z: Cancel request is committed for workflow job: 2021-11-24_04_50_55-1265207853464830809.
Nov 24, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T16:00:24.336Z: Cleaning up.
Nov 24, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T16:00:24.408Z: Stopping worker pool...
Nov 24, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T16:00:24.466Z: Stopping worker pool...
Nov 24, 2021 4:02:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T16:02:51.787Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 24, 2021 4:02:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-24T16:02:51.824Z: Worker pool stopped.
Nov 24, 2021 4:02:59 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-24_04_50_55-1265207853464830809 finished with status CANCELLED.
Load test results for test (ID): 5ebf6421-0287-43dc-9698-1208f36d72e8 and timestamp: 2021-11-24T12:50:50.252000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11217.102
dataflow_v2_java11_total_bytes_count             1.71173265E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211124124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ec7ad05fef57bf2159bc12994cea371557758d553489bcf5a5cb864fe099730d
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211124124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ec7ad05fef57bf2159bc12994cea371557758d553489bcf5a5cb864fe099730d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211124124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ec7ad05fef57bf2159bc12994cea371557758d553489bcf5a5cb864fe099730d])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ec7ad05fef57bf2159bc12994cea371557758d553489bcf5a5cb864fe099730d
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ec7ad05fef57bf2159bc12994cea371557758d553489bcf5a5cb864fe099730d
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ec7ad05fef57bf2159bc12994cea371557758d553489bcf5a5cb864fe099730d].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 44s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rrb4menh2jgx4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #159

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/159/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Make sure implicit inputs are noted in the portable graph.

[benjamin.gonzalez] [BEAM-13003] Integration Test for DebeziumIO that connects to

[noreply] Merge pull request #16032 from [BEAM-13297][Playground][Bugfix]

[noreply] Merge pull request #16031 from [BEAM-13296][Playground][Bugfix] remove

[noreply] Merge pull request #16030 from [BEAM-13219][Playground] Initialise

[noreply] Merge pull request #15996 from [BEAM-13156][Playground] Implement


------------------------------------------
[...truncated 48.24 KB...]
b74b745bfc34: Preparing
b1f85dad940c: Preparing
a9e4c9343539: Preparing
47ee2d19f81a: Preparing
ab9d251e27cb: Preparing
8a5844586fdb: Preparing
a4aba4e59b40: Preparing
5499f2905579: Preparing
a36ba9e322f7: Preparing
d7604a28e3c8: Waiting
a4aba4e59b40: Waiting
8a5844586fdb: Waiting
6014bd277026: Waiting
b1f85dad940c: Waiting
5499f2905579: Waiting
311671353f63: Waiting
260d6aa87acb: Waiting
a9e4c9343539: Waiting
a36ba9e322f7: Waiting
231770730dba: Waiting
47ee2d19f81a: Waiting
b74b745bfc34: Waiting
ab9d251e27cb: Waiting
c0b43a29c131: Pushed
b2f605300a8b: Pushed
0f1a4dcd5c41: Pushed
7afb0064df19: Pushed
d7604a28e3c8: Pushed
1fd7139284c3: Pushed
311671353f63: Pushed
260d6aa87acb: Pushed
a9e4c9343539: Layer already exists
47ee2d19f81a: Layer already exists
ab9d251e27cb: Layer already exists
b74b745bfc34: Pushed
8a5844586fdb: Layer already exists
a4aba4e59b40: Layer already exists
5499f2905579: Layer already exists
a36ba9e322f7: Layer already exists
b1f85dad940c: Pushed
6014bd277026: Pushed
231770730dba: Pushed
20211123124333: digest: sha256:66eef5700c211a3c64af38ce5c6146d89a8fc5ead4592cc8b04857b781f5846d size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 23, 2021 12:45:48 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 23, 2021 12:45:49 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 23, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 23, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 23, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 23, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111716 bytes, hash 6c0ed3d7b543d743c14eaf66a9beb0dde45cb5f451d0ecd04dd2ee20b5d80417> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-bA7T17VD10PBTq9mqb6w3eRctfRR0OzQTdLuILXYBBc.pb
Nov 23, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 23, 2021 12:45:55 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3af7d855, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77049094, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f88bfbe]
Nov 23, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 23, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 23, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 23, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 23, 2021 12:45:55 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f1868c9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17]
Nov 23, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 23, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 23, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 23, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 23, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 23, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 23, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 23, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 23, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 23, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 23, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 23, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Nov 23, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-23_04_45_55-15625709898218145271?project=apache-beam-testing
Nov 23, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-23_04_45_55-15625709898218145271
Nov 23, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-23_04_45_55-15625709898218145271
Nov 23, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-23T12:46:03.975Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-g293. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
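The warning above fires because the Jenkins job name (`beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11`) contains capitals and underscores, so Dataflow rewrites it before using it as a Cloud Label on GCE instances. A minimal sketch of that validity check, assuming the usual GCE label-value rules (lowercase letters, digits, and hyphens only, starting with a letter, at most 63 characters -- the exact pattern is an assumption, not taken from Dataflow's source):

```python
import re

# Assumed pattern for a valid Cloud Label value per the restrictions page
# linked in the warning: starts with a lowercase letter, then lowercase
# letters, digits, or hyphens, ends alphanumeric, max 63 characters total.
LABEL_RE = re.compile(r"^[a-z]([-a-z0-9]{0,61}[a-z0-9])?$")

def is_valid_cloud_label(name: str) -> bool:
    """Return True if `name` can be used verbatim as a Cloud Label value."""
    return bool(LABEL_RE.fullmatch(name))

# The original job name contains capitals and underscores, so it fails;
# the rewritten name from the log (underscores replaced with '0',
# lowercased, truncated) passes.
original = "beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11"
modified = "load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-g293"
```

Naming the Jenkins job with only lowercase letters, digits, and hyphens would avoid the rewrite and keep monitoring labels matching the job name.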
Nov 23, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:09.770Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:10.410Z: Expanding SplittableParDo operations into optimizable parts.
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:10.441Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:10.508Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:10.579Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:10.638Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:10.699Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:10.814Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:10.857Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:10.892Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:10.925Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:10.960Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:10.985Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:11.009Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:11.026Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:11.058Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:11.088Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:11.132Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:11.168Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:11.197Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:11.227Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:11.260Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:11.292Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:11.320Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:11.343Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:11.378Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:11.408Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:11.443Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:11.478Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:11.513Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 23, 2021 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:11.875Z: Starting 5 workers in us-central1-a...
Nov 23, 2021 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:46:23.117Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 23, 2021 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:47:05.428Z: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 23, 2021 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:47:05.461Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
Nov 23, 2021 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:47:15.655Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 23, 2021 12:48:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:48:10.228Z: Workers have started successfully.
Nov 23, 2021 12:48:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T12:48:10.264Z: Workers have started successfully.
Nov 23, 2021 3:13:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-23T15:13:15.027Z: Staged package kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar' is inaccessible.
Nov 23, 2021 3:13:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-23T15:13:15.159Z: Staged package lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar' is inaccessible.
Nov 23, 2021 3:13:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-23T15:13:16.026Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 23, 2021 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T16:00:43.184Z: Cancel request is committed for workflow job: 2021-11-23_04_45_55-15625709898218145271.
Nov 23, 2021 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T16:00:43.275Z: Cleaning up.
Nov 23, 2021 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T16:00:43.345Z: Stopping worker pool...
Nov 23, 2021 4:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T16:00:43.413Z: Stopping worker pool...
Nov 23, 2021 4:03:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T16:03:07.804Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 23, 2021 4:03:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-23T16:03:07.848Z: Worker pool stopped.
Nov 23, 2021 4:03:15 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-23_04_45_55-15625709898218145271 finished with status CANCELLED.
Load test results for test (ID): ae3c8169-6cd6-45a5-847e-940829c58929 and timestamp: 2021-11-23T12:45:49.428000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11477.109
dataflow_v2_java11_total_bytes_count             2.56585283E10
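The two metrics above give an average end-to-end throughput for the run. This is a back-of-the-envelope calculation, assuming `total_bytes_count` is the total bytes processed and `runtime_sec` is the wall-clock pipeline runtime (an interpretation of the metric names, not documented Beam semantics):

```python
# Values copied from the load test results above.
runtime_sec = 11477.109
total_bytes = 2.56585283e10

# Average throughput over the whole run, in bytes per second.
throughput_bytes_per_sec = total_bytes / runtime_sec
print(f"avg throughput: {throughput_bytes_per_sec / 1e6:.2f} MB/s")
```

This works out to roughly 2.24 MB/s across the 5 e2-standard-2 workers, which is useful mainly for comparing runs of the same load test against each other.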
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
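The stack trace shows why a streaming load test that is cancelled on schedule still fails the build: `JobFailure.handleFailure` treats any terminal state other than a successful one as an error. A simplified illustrative sketch of that check (not Beam's actual implementation; the accepted-state set here is an assumption):

```python
# States this sketch treats as a successful end of the pipeline; the real
# JobFailure logic lives in org.apache.beam.sdk.loadtests.JobFailure.
SUCCESSFUL_TERMINAL_STATES = {"DONE"}

def handle_failure(terminal_state: str) -> None:
    """Raise if the pipeline did not end in an accepted terminal state."""
    if terminal_state not in SUCCESSFUL_TERMINAL_STATES:
        raise RuntimeError(f"Invalid job state: {terminal_state}.")
```

In this run the harness cancelled the streaming job after the timed window (`Cancel request is committed` above), so the job's terminal state was CANCELLED rather than DONE and the load test process exited non-zero, failing the Gradle task.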

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211123124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:66eef5700c211a3c64af38ce5c6146d89a8fc5ead4592cc8b04857b781f5846d
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211123124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:66eef5700c211a3c64af38ce5c6146d89a8fc5ead4592cc8b04857b781f5846d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211123124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:66eef5700c211a3c64af38ce5c6146d89a8fc5ead4592cc8b04857b781f5846d])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:66eef5700c211a3c64af38ce5c6146d89a8fc5ead4592cc8b04857b781f5846d
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:66eef5700c211a3c64af38ce5c6146d89a8fc5ead4592cc8b04857b781f5846d
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:66eef5700c211a3c64af38ce5c6146d89a8fc5ead4592cc8b04857b781f5846d].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 58s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/sx33if7sjajmm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #158

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/158/display/redirect>

Changes:


------------------------------------------
[...truncated 49.22 KB...]
72e960c5fc8a: Preparing
57223c2604cb: Preparing
da2ae2375a65: Preparing
021f027cdc5e: Preparing
c1ce556ab59f: Preparing
378a7fe2cc02: Preparing
a9e4c9343539: Preparing
47ee2d19f81a: Preparing
ab9d251e27cb: Preparing
8a5844586fdb: Preparing
a4aba4e59b40: Preparing
5499f2905579: Preparing
a36ba9e322f7: Preparing
5c9fef3dc784: Waiting
47ee2d19f81a: Waiting
72e960c5fc8a: Waiting
ab9d251e27cb: Waiting
8a5844586fdb: Waiting
57223c2604cb: Waiting
a4aba4e59b40: Waiting
c1ce556ab59f: Waiting
5499f2905579: Waiting
021f027cdc5e: Waiting
378a7fe2cc02: Waiting
a36ba9e322f7: Waiting
da2ae2375a65: Waiting
a9e4c9343539: Waiting
c780e152516c: Pushed
1e96de073336: Pushed
80c10b07718b: Pushed
0c5a68def77b: Pushed
5c9fef3dc784: Pushed
f5aaec88dbeb: Pushed
da2ae2375a65: Pushed
57223c2604cb: Pushed
c1ce556ab59f: Pushed
a9e4c9343539: Layer already exists
47ee2d19f81a: Layer already exists
ab9d251e27cb: Layer already exists
8a5844586fdb: Layer already exists
a4aba4e59b40: Layer already exists
5499f2905579: Layer already exists
72e960c5fc8a: Pushed
a36ba9e322f7: Layer already exists
378a7fe2cc02: Pushed
021f027cdc5e: Pushed
20211122124329: digest: sha256:1ad68dc1a2997caca043965ce59064e446f0825db08c222b2f134f61bacc8b7b size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 22, 2021 12:45:17 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 22, 2021 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 22, 2021 12:45:18 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 22, 2021 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 22, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 22, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 22, 2021 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 22, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111716 bytes, hash dd3ad8e5ac38c666aaa706ccdf72ba6766832e74c2dc1a99f10bb3bec0c1e515> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-3TrY5aw4xmaqpwbM33K6Z2aDLnTC3BqZ8QuzvsDB5RU.pb
Nov 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 22, 2021 12:45:22 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@177515d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@52ff99cd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c2af006, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6]
Nov 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 22, 2021 12:45:22 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3206174f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@42fb8c87, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15eb0ae9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65e0b505, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67de7a99]
Nov 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Nov 22, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-22_04_45_22-3733807565465347251?project=apache-beam-testing
Nov 22, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-22_04_45_22-3733807565465347251
Nov 22, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-22_04_45_22-3733807565465347251
Nov 22, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-22T12:45:28.610Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-13yh. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 22, 2021 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:32.493Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:33.144Z: Expanding SplittableParDo operations into optimizable parts.
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:33.212Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:33.366Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:33.478Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:33.507Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:33.560Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:33.700Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:33.734Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:33.768Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:33.799Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:33.821Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:33.856Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:33.889Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:33.920Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:33.966Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:33.991Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:34.014Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:34.067Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:34.099Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:34.132Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:34.164Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:34.208Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:34.244Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:34.274Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:34.308Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:34.340Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:34.371Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:34.408Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:34.438Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:34.846Z: Starting 5 workers in us-central1-a...
Nov 22, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:45:46.869Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 22, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:46:20.521Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 22, 2021 12:47:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:47:14.676Z: Workers have started successfully.
Nov 22, 2021 12:47:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T12:47:14.709Z: Workers have started successfully.
Nov 22, 2021 2:54:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-22T14:54:35.796Z: Staged package classgraph-4.8.104-ePvfM6WdiJMBzk05ytbzmopew--T1rqGqw_PYcvi6h0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/classgraph-4.8.104-ePvfM6WdiJMBzk05ytbzmopew--T1rqGqw_PYcvi6h0.jar' is inaccessible.
Nov 22, 2021 2:54:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-22T14:54:37.408Z: Staged package junit-4.13.1-wwcZ25dNZFJ5P-GRs2OKV3cAVIW64UWSQERTD_pfYSI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/junit-4.13.1-wwcZ25dNZFJ5P-GRs2OKV3cAVIW64UWSQERTD_pfYSI.jar' is inaccessible.
Nov 22, 2021 2:54:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-22T14:54:38.247Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 22, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T16:00:34.551Z: Cancel request is committed for workflow job: 2021-11-22_04_45_22-3733807565465347251.
Nov 22, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T16:00:34.588Z: Cleaning up.
Nov 22, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T16:00:34.658Z: Stopping worker pool...
Nov 22, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T16:00:34.726Z: Stopping worker pool...
Nov 22, 2021 4:03:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T16:03:00.489Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 22, 2021 4:03:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-22T16:03:00.527Z: Worker pool stopped.
Nov 22, 2021 4:03:10 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-22_04_45_22-3733807565465347251 finished with status CANCELLED.
Load test results for test (ID): 476b5c73-2aee-4ee4-8fb1-b7f7ef9a28ed and timestamp: 2021-11-22T12:45:18.020000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11564.866
dataflow_v2_java11_total_bytes_count             1.61944335E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
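The `Invalid job state: CANCELLED` exception above is the load-test harness rejecting any terminal job state other than DONE: the test ran until the scheduled cancel, so `org.apache.beam.sdk.loadtests.JobFailure.handleFailure` treats the cancelled job as a failed run. A minimal sketch of that kind of check (class and method names here are illustrative, not Beam's actual API):

```java
// Hedged sketch of a terminal-state check like the one that fails above.
// Names are illustrative; the real logic lives in
// org.apache.beam.sdk.loadtests.JobFailure.handleFailure.
public class JobStateCheck {
    enum State { RUNNING, DONE, CANCELLED, FAILED }

    // Throws if the job ended in any terminal state other than DONE.
    static void handleFailure(State terminalState) {
        if (terminalState != State.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // prints "Invalid job state: CANCELLED."
        }
    }
}
```

Under this reading, the build failure is a consequence of the job being cancelled at the scheduled cutoff rather than of a pipeline error; the earlier SEVERE messages about inaccessible staged packages are the more likely root cause to investigate.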

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211122124329
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ad68dc1a2997caca043965ce59064e446f0825db08c222b2f134f61bacc8b7b
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211122124329]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ad68dc1a2997caca043965ce59064e446f0825db08c222b2f134f61bacc8b7b]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211122124329] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ad68dc1a2997caca043965ce59064e446f0825db08c222b2f134f61bacc8b7b])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ad68dc1a2997caca043965ce59064e446f0825db08c222b2f134f61bacc8b7b
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ad68dc1a2997caca043965ce59064e446f0825db08c222b2f134f61bacc8b7b
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ad68dc1a2997caca043965ce59064e446f0825db08c222b2f134f61bacc8b7b].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 55s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/txd6con4aax3y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #157

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/157/display/redirect>

Changes:


------------------------------------------
[...truncated 47.93 KB...]
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
9a78e3d29fe7: Preparing
f46ac1e2e1f8: Preparing
ff58424771b8: Preparing
9137aacd975c: Preparing
28ece229f307: Preparing
9cc44f341e20: Preparing
390dc3f99c8d: Preparing
5ab3e64cf269: Preparing
c06dacce3e5b: Preparing
7cabaff45161: Preparing
234f762dab19: Preparing
3c6a9037b0da: Preparing
a9e4c9343539: Preparing
47ee2d19f81a: Preparing
ab9d251e27cb: Preparing
8a5844586fdb: Preparing
a4aba4e59b40: Preparing
5499f2905579: Preparing
a36ba9e322f7: Preparing
a4aba4e59b40: Waiting
7cabaff45161: Waiting
5499f2905579: Waiting
ab9d251e27cb: Waiting
8a5844586fdb: Waiting
390dc3f99c8d: Waiting
a9e4c9343539: Waiting
9cc44f341e20: Waiting
47ee2d19f81a: Waiting
5ab3e64cf269: Waiting
234f762dab19: Waiting
3c6a9037b0da: Waiting
c06dacce3e5b: Waiting
28ece229f307: Pushed
ff58424771b8: Pushed
f46ac1e2e1f8: Pushed
9a78e3d29fe7: Pushed
9cc44f341e20: Pushed
9137aacd975c: Pushed
5ab3e64cf269: Pushed
c06dacce3e5b: Pushed
a9e4c9343539: Layer already exists
47ee2d19f81a: Layer already exists
234f762dab19: Pushed
ab9d251e27cb: Layer already exists
8a5844586fdb: Layer already exists
a4aba4e59b40: Layer already exists
5499f2905579: Layer already exists
a36ba9e322f7: Layer already exists
3c6a9037b0da: Pushed
390dc3f99c8d: Pushed
7cabaff45161: Pushed
20211121124331: digest: sha256:6ba11c2bcbbb7339217c700280f3a0997f315823d564b460174915a93cf87157 size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 21, 2021 12:45:22 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 21, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 21, 2021 12:45:23 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 21, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 21, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 21, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 21, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 21, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111716 bytes, hash faa5a3598be10ac04f453c63a9557fa9e21a36ec4421cf1d408426fbc23e74f5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline--qWjWYvhCsBPRTxjqVV_qeIaNuxEIc8dQIQm-8I-dPU.pb
Nov 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 21, 2021 12:45:28 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@177515d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@52ff99cd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c2af006, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6]
Nov 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 21, 2021 12:45:28 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3206174f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@42fb8c87, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15eb0ae9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65e0b505, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67de7a99]
Nov 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Nov 21, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-21_04_45_28-6576307809437182887?project=apache-beam-testing
Nov 21, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-21_04_45_28-6576307809437182887
Nov 21, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-21_04_45_28-6576307809437182887
Nov 21, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-21T12:45:35.559Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-9n5m. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:39.551Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:40.122Z: Expanding SplittableParDo operations into optimizable parts.
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:40.158Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:40.206Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:40.301Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:40.342Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:40.409Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:40.576Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:40.626Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:40.694Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:40.730Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:40.763Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:40.790Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:40.855Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:40.896Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:40.938Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:40.971Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:41.005Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 21, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:41.039Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 21, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:41.072Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 21, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:41.112Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 21, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:41.144Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 21, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:41.180Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 21, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:41.212Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 21, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:41.233Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 21, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:41.269Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 21, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:41.298Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 21, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:41.327Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 21, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:41.357Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 21, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:41.378Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 21, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:45:41.767Z: Starting 5 workers in us-central1-a...
Nov 21, 2021 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:46:14.978Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 21, 2021 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:46:25.307Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 21, 2021 12:47:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:47:20.797Z: Workers have started successfully.
Nov 21, 2021 12:47:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T12:47:20.834Z: Workers have started successfully.
Nov 21, 2021 4:00:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T16:00:23.808Z: Cancel request is committed for workflow job: 2021-11-21_04_45_28-6576307809437182887.
Nov 21, 2021 4:00:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T16:00:23.890Z: Cleaning up.
Nov 21, 2021 4:00:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T16:00:23.958Z: Stopping worker pool...
Nov 21, 2021 4:00:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T16:00:24.108Z: Stopping worker pool...
Nov 21, 2021 4:02:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T16:02:47.865Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 21, 2021 4:02:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-21T16:02:47.900Z: Worker pool stopped.
Nov 21, 2021 4:02:54 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-21_04_45_28-6576307809437182887 finished with status CANCELLED.
Load test results for test (ID): b1d10156-b474-41a6-acde-37f06af6f1b5 and timestamp: 2021-11-21T12:45:23.552000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11535.212
dataflow_v2_java11_total_bytes_count              1.9542081E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211121124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6ba11c2bcbbb7339217c700280f3a0997f315823d564b460174915a93cf87157
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211121124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6ba11c2bcbbb7339217c700280f3a0997f315823d564b460174915a93cf87157]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211121124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6ba11c2bcbbb7339217c700280f3a0997f315823d564b460174915a93cf87157])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6ba11c2bcbbb7339217c700280f3a0997f315823d564b460174915a93cf87157
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6ba11c2bcbbb7339217c700280f3a0997f315823d564b460174915a93cf87157
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6ba11c2bcbbb7339217c700280f3a0997f315823d564b460174915a93cf87157].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 39s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cpems35pgkj7i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
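For triage of build reports like the one above, the failing Gradle task can be pulled out of the log text programmatically. A minimal sketch (the regular expression and helper name are illustrative assumptions, not part of the Beam or Gradle tooling):

```python
import re
from typing import Optional

def failing_gradle_task(log_text: str) -> Optional[str]:
    """Extract the task name from a Gradle 'Execution failed for task' line."""
    match = re.search(r"Execution failed for task '([^']+)'", log_text)
    return match.group(1) if match else None

# Sample line copied from the failure summary above.
sample = "Execution failed for task ':sdks:java:testing:load-tests:run'."
print(failing_gradle_task(sample))  # -> :sdks:java:testing:load-tests:run
```

This kind of extraction is useful when scanning many archived failure emails for the same failing task.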


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #156

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/156/display/redirect?page=changes>

Changes:

[Valentyn Tymofieiev] Add future back to Beam containers.

[noreply] [BEAM-13290] Handle disable_prime_runner_v2 flag (#16018)

[noreply] [BEAM-11936] Add configuration parameter for suppressing unusedvariable

[noreply] Merge pull request #15957 from [BEAM-13045][Playground] Implemented

[noreply] [BEAM-13286] RowJson should expect ReadableInstant for DATETIME values

[noreply] Merge pull request #16025 from [BEAM-13216] [Playground] Add client to

[ningkang0957] Updated dep and goldens for Interactive Beam screen diff tests

[ningkang0957] Removed Pillow from requirements of containers.

[noreply] Merge pull request #16013 from [BEAM-13281] [playground] Update favicon

[noreply] Merge pull request #15768 from [BEAM-13064] [Playground] Implement

[noreply] Added go code coverage report with codecov (#16020)


------------------------------------------
[...truncated 49.21 KB...]
a9e4c9343539: Preparing
47ee2d19f81a: Preparing
bff336f6281d: Waiting
ab9d251e27cb: Preparing
ebb277931c10: Waiting
8a5844586fdb: Preparing
447d12c5c5de: Waiting
e0550e470f35: Waiting
47ee2d19f81a: Waiting
13740d6db0ba: Waiting
a4aba4e59b40: Preparing
5499f2905579: Preparing
a36ba9e322f7: Preparing
8a5844586fdb: Waiting
a4aba4e59b40: Waiting
5499f2905579: Waiting
d48ea7248abd: Waiting
f4d6d610530e: Waiting
ab9d251e27cb: Waiting
e0b518950164: Pushed
fc4b07ecff9c: Pushed
e1e243790773: Pushed
d791c195e879: Pushed
e0550e470f35: Pushed
4e971804c8a0: Pushed
bff336f6281d: Pushed
ebb277931c10: Pushed
a9e4c9343539: Layer already exists
f4d6d610530e: Pushed
447d12c5c5de: Pushed
47ee2d19f81a: Layer already exists
ab9d251e27cb: Layer already exists
13740d6db0ba: Pushed
8a5844586fdb: Layer already exists
a4aba4e59b40: Layer already exists
5499f2905579: Layer already exists
a36ba9e322f7: Layer already exists
d48ea7248abd: Pushed
20211120124333: digest: sha256:b1c7edf15e98454213402b589ec0a12be1c66ebc962cd0e81f04a67d8d31cc42 size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 20, 2021 12:45:31 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 20, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 20, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 20, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 20, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 20, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111716 bytes, hash 9a2b094cc6ba95bfbeb4a9c4e03d5eddaaf202c31a19e7ac4190b53e5be996b0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-misJTMa6lb--tKnE4D1e3aryAsMaGeesQZC1PlvplrA.pb
Nov 20, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 20, 2021 12:45:35 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3af7d855, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77049094, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f88bfbe]
Nov 20, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 20, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 20, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 20, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 20, 2021 12:45:35 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f1868c9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17]
Nov 20, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 20, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 20, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 20, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 20, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 20, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 20, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 20, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 20, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 20, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 20, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 20, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Nov 20, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-20_04_45_35-5493442597811009091?project=apache-beam-testing
Nov 20, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-20_04_45_35-5493442597811009091
Nov 20, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-20_04_45_35-5493442597811009091
Nov 20, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-20T12:45:43.018Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-ehn9. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 20, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:47.053Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 20, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:47.601Z: Expanding SplittableParDo operations into optimizable parts.
Nov 20, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:47.631Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 20, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:47.699Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 20, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:47.768Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 20, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:47.832Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 20, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:47.897Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:47.992Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.022Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.051Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.084Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.120Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.146Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.167Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.198Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.223Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.255Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.282Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.331Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.365Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.397Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.432Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.456Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.483Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.510Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.545Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.581Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.608Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.638Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:48.678Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 20, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:45:49.135Z: Starting 5 workers in us-central1-a...
Nov 20, 2021 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:46:19.772Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 20, 2021 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:46:28.633Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 20, 2021 12:47:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:47:26.271Z: Workers have started successfully.
Nov 20, 2021 12:47:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T12:47:26.298Z: Workers have started successfully.
Nov 20, 2021 1:27:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-20T13:27:50.238Z: Staged package commons-io-2.6--HfTBGYKwqFC84ZbrfyXHex-1zx0fH-NXS9ROcpzZRM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-io-2.6--HfTBGYKwqFC84ZbrfyXHex-1zx0fH-NXS9ROcpzZRM.jar' is inaccessible.
Nov 20, 2021 1:27:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-20T13:27:50.316Z: Staged package commons-lang-2.6-UPEbCfh3wpTVbyRGP0fSj5Kc9QRPZIZhwPDPuumi9Jw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-lang-2.6-UPEbCfh3wpTVbyRGP0fSj5Kc9QRPZIZhwPDPuumi9Jw.jar' is inaccessible.
Nov 20, 2021 1:27:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-20T13:27:51.177Z: Staged package google-cloud-pubsublite-1.2.0-5U3FD5IEkQizHwu39BBM5AgOV1a73rwGWbaivmdyeSI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-pubsublite-1.2.0-5U3FD5IEkQizHwu39BBM5AgOV1a73rwGWbaivmdyeSI.jar' is inaccessible.
Nov 20, 2021 1:27:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-20T13:27:51.622Z: Staged package grpc-google-cloud-pubsublite-v1-1.2.0-wV0rl2IQjaeB0Zqm6xgMut8w_3REeRHMW8XtX4sPX64.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-pubsublite-v1-1.2.0-wV0rl2IQjaeB0Zqm6xgMut8w_3REeRHMW8XtX4sPX64.jar' is inaccessible.
Nov 20, 2021 1:27:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-20T13:27:52.003Z: Staged package ion-java-1.0.2-DRJ7IFofzgq8KjdXoEF0hlG8ZsFc9MBZusWDOyfUcaU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/ion-java-1.0.2-DRJ7IFofzgq8KjdXoEF0hlG8ZsFc9MBZusWDOyfUcaU.jar' is inaccessible.
Nov 20, 2021 1:27:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-20T13:27:52.901Z: Staged package proto-google-cloud-pubsublite-v1-1.2.0-vWj_eMbE5U3cmHa7Ax4j7qYSkZojafmPkGvafYKuTKs.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-pubsublite-v1-1.2.0-vWj_eMbE5U3cmHa7Ax4j7qYSkZojafmPkGvafYKuTKs.jar' is inaccessible.
Nov 20, 2021 1:27:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-20T13:27:53.227Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
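The SEVERE entries above all report staged jars under the same GCS staging prefix as inaccessible. A quick hypothetical follow-up is to probe each reported path with `gsutil stat`; the sketch below only rebuilds those probe commands from jar names copied out of the log (it does not invoke `gsutil` or touch GCS):

```python
# Staging prefix and jar names copied verbatim from the SEVERE log lines above.
STAGING = "gs://temp-storage-for-perf-tests/loadtests/staging"
inaccessible = [
    "commons-io-2.6--HfTBGYKwqFC84ZbrfyXHex-1zx0fH-NXS9ROcpzZRM.jar",
    "commons-lang-2.6-UPEbCfh3wpTVbyRGP0fSj5Kc9QRPZIZhwPDPuumi9Jw.jar",
]

# Build one `gsutil stat` probe per inaccessible package.
commands = [f"gsutil stat {STAGING}/{jar}" for jar in inaccessible]
for cmd in commands:
    print(cmd)
```

Running the printed commands under the same service account the job uses would distinguish a missing object from a permissions problem.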
Nov 20, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T16:00:25.676Z: Cancel request is committed for workflow job: 2021-11-20_04_45_35-5493442597811009091.
Nov 20, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T16:00:25.716Z: Cleaning up.
Nov 20, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T16:00:25.805Z: Stopping worker pool...
Nov 20, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T16:00:25.847Z: Stopping worker pool...
Nov 20, 2021 4:02:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T16:02:51.680Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 20, 2021 4:02:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-20T16:02:51.711Z: Worker pool stopped.
Nov 20, 2021 4:02:57 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-20_04_45_35-5493442597811009091 finished with status CANCELLED.
Load test results for test (ID): 541f0a0a-6dd7-4376-bc9c-7f05791a3430 and timestamp: 2021-11-20T12:45:31.177000000Z:
Metric:                                    Value:
dataflow_v2_java11_runtime_sec             11549.51
dataflow_v2_java11_total_bytes_count       2.00664536E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
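The reported metrics are easier to read in larger units; a small sketch using the two values copied from the results table above (the unit conversions are the only thing added here):

```python
# Values copied from the load-test results reported above.
runtime_sec = 11549.51
total_bytes = 2.00664536e10

# Convert to hours and binary gibibytes for readability.
runtime_hours = runtime_sec / 3600
total_gib = total_bytes / 2**30

print(f"runtime: {runtime_hours:.2f} h")   # about 3.21 h before cancellation
print(f"processed: {total_gib:.2f} GiB")   # about 18.69 GiB
```

The roughly 3.2-hour runtime is consistent with the job being cancelled at 16:00 after its 12:45 submission, which is why the test harness raises `Invalid job state: CANCELLED`.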

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211120124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b1c7edf15e98454213402b589ec0a12be1c66ebc962cd0e81f04a67d8d31cc42
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211120124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b1c7edf15e98454213402b589ec0a12be1c66ebc962cd0e81f04a67d8d31cc42]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211120124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b1c7edf15e98454213402b589ec0a12be1c66ebc962cd0e81f04a67d8d31cc42])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b1c7edf15e98454213402b589ec0a12be1c66ebc962cd0e81f04a67d8d31cc42
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b1c7edf15e98454213402b589ec0a12be1c66ebc962cd0e81f04a67d8d31cc42
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b1c7edf15e98454213402b589ec0a12be1c66ebc962cd0e81f04a67d8d31cc42].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 41s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/2kh25e3ken3vw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #155

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/155/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Allow wildcards for java class lookup transform providers.

[Robert Bradshaw] Disallow getClass().forname().

[Robert Bradshaw] Docs, globbing, and a test.

[Robert Bradshaw] Defer expansion service creation for easier testing.

[Robert Bradshaw] Allow java jar service with classpath.

[avilovpavel6] Support GO sdk

[dpcollins] Fix race condition in Pub/Sub Lite SDF that causes it to error out when

[dpcollins] edit

[noreply] Merge pull request #15785 from Support for passing priority parameter in

[dpcollins] Restructure processor to access beam types from downcall thread

[dpcollins] Restructure processor to access beam types from downcall thread

[noreply] [BEAM-13283] Skip apache beam dependency installation from

[dpcollins] Restructure processor to access beam types from downcall thread

[noreply] Merge pull request #15956 from [BEAM-13210] [Playground]: support

[noreply] created quickstart guide for multi-language pipelines (Python) (#16001)

[dpcollins] use interface in logger to work around likely SLF4J bug

[dpcollins] revert logger class

[noreply] Revert "Allow wildcards for java class lookup transform providers."

[Valentyn Tymofieiev] Moving to 2.36.0-SNAPSHOT on master branch.


------------------------------------------
[...truncated 49.74 KB...]
6f42bffc4348: Pushed
5bbd48f7b5d8: Pushed
20211119124334: digest: sha256:3622d6a0a35337e17b60c710b6399c61252064c92dbfd6a314f20d7a58007ec5 size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 19, 2021 12:45:32 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 19, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 19, 2021 12:45:33 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 19, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 19, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 19, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 19, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 19, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111716 bytes, hash 99ce5648231c57b1ef5566d8fdd091d391f22ebfc24bcd0ba953c1475ea5180c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-mc5WSCMcV7HvVWbY_dCR05HyLr_CS80LqVPBR16lGAw.pb
Nov 19, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 19, 2021 12:45:37 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3af7d855, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77049094, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f88bfbe]
Nov 19, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 19, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 19, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 19, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 19, 2021 12:45:37 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f1868c9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17]
Nov 19, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 19, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 19, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 19, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 19, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 19, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 19, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 19, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 19, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 19, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 19, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 19, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Nov 19, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-19_04_45_38-7329796497055665608?project=apache-beam-testing
Nov 19, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-19_04_45_38-7329796497055665608
Nov 19, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-19_04_45_38-7329796497055665608
Nov 19, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-19T12:45:45.477Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-84ub. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:49.297Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:49.961Z: Expanding SplittableParDo operations into optimizable parts.
Nov 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:49.997Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.061Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.131Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.167Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.216Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.338Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.374Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.410Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.431Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.464Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.495Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.524Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.574Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.609Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.637Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.662Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.685Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.717Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.752Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.786Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.826Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.852Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.875Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.900Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.933Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.965Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:50.998Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:51.025Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:45:51.408Z: Starting 5 workers in us-central1-a...
Nov 19, 2021 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:46:12.813Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 19, 2021 12:46:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:46:41.339Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 19, 2021 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:47:37.392Z: Workers have started successfully.
Nov 19, 2021 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T12:47:37.428Z: Workers have started successfully.
Nov 19, 2021 1:26:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T13:26:14.502Z: Workers have started successfully.
Nov 19, 2021 1:26:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T13:26:17.159Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 19, 2021 1:26:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T13:26:29.042Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 19, 2021 1:50:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-19T13:50:15.707Z: Staged package jackson-dataformat-cbor-2.13.0-87i8Y8vQiEl2BVWwSbyVD2k9G-KE9nB4qPOXblapNm4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-dataformat-cbor-2.13.0-87i8Y8vQiEl2BVWwSbyVD2k9G-KE9nB4qPOXblapNm4.jar' is inaccessible.
Nov 19, 2021 1:50:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-19T13:50:16.953Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 19, 2021 1:53:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-19T13:53:16.977Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 19, 2021 1:56:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-19T13:56:15.747Z: Staged package jackson-dataformat-cbor-2.13.0-87i8Y8vQiEl2BVWwSbyVD2k9G-KE9nB4qPOXblapNm4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-dataformat-cbor-2.13.0-87i8Y8vQiEl2BVWwSbyVD2k9G-KE9nB4qPOXblapNm4.jar' is inaccessible.
Nov 19, 2021 1:56:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-19T13:56:16.928Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 19, 2021 1:59:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-19T13:59:16.822Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 19, 2021 2:02:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-19T14:02:15.633Z: Staged package jackson-dataformat-cbor-2.13.0-87i8Y8vQiEl2BVWwSbyVD2k9G-KE9nB4qPOXblapNm4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-dataformat-cbor-2.13.0-87i8Y8vQiEl2BVWwSbyVD2k9G-KE9nB4qPOXblapNm4.jar' is inaccessible.
Nov 19, 2021 2:02:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-19T14:02:16.913Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
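The SEVERE lines above all name the same inaccessible staged jar. A small sketch (a hypothetical helper, not part of the Beam load-test harness) for pulling the offending GCS path out of such a log line, e.g. to feed into a follow-up `gsutil stat` check:

```python
# Hypothetical helper: extract the gs:// path of an inaccessible staged
# package from a Dataflow "Staged package ... is inaccessible" log line.
import re

# One of the SEVERE lines from the log above, reproduced verbatim.
LINE = ("SEVERE: 2021-11-19T13:50:15.707Z: Staged package "
        "jackson-dataformat-cbor-2.13.0-87i8Y8vQiEl2BVWwSbyVD2k9G-KE9nB4qPOXblapNm4.jar "
        "at location 'gs://temp-storage-for-perf-tests/loadtests/staging/"
        "jackson-dataformat-cbor-2.13.0-87i8Y8vQiEl2BVWwSbyVD2k9G-KE9nB4qPOXblapNm4.jar' "
        "is inaccessible.")

def inaccessible_package(line):
    """Return the gs:// path from an 'is inaccessible' message, or None."""
    match = re.search(r"at location '(gs://[^']+)' is inaccessible", line)
    return match.group(1) if match else None

path = inaccessible_package(LINE)
print(path)  # the staged jar to verify, e.g. with: gsutil stat <path>
```

The extracted path can then be checked by hand for existence and read permission under the job's service account, which is what the access-check warnings point at.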
Nov 19, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T16:00:28.295Z: Cancel request is committed for workflow job: 2021-11-19_04_45_38-7329796497055665608.
Nov 19, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T16:00:28.335Z: Cleaning up.
Nov 19, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T16:00:28.418Z: Stopping worker pool...
Nov 19, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T16:00:28.459Z: Stopping worker pool...
Nov 19, 2021 4:02:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T16:02:45.252Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 19, 2021 4:02:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-19T16:02:45.297Z: Worker pool stopped.
Nov 19, 2021 4:02:51 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-19_04_45_38-7329796497055665608 finished with status CANCELLED.
Load test results for test (ID): 4977a7da-4f55-4b89-aa18-5706c0a097db and timestamp: 2021-11-19T12:45:33.169000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11547.423
dataflow_v2_java11_total_bytes_count             2.01742936E10
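The two metric lines above can be turned into numbers with a short script (a hypothetical helper, not part of the test harness); dividing the byte count by the runtime gives the run's average throughput:

```python
# Hypothetical helper: parse the "Metric: Value:" lines printed by the
# load test and derive an approximate average throughput.
import re

# The two metric lines from the log above, reproduced verbatim.
LOG_SNIPPET = """\
dataflow_v2_java11_runtime_sec                 11547.423
dataflow_v2_java11_total_bytes_count             2.01742936E10
"""

def parse_metrics(text):
    """Return {metric_name: float_value} for lines of 'name   value'."""
    metrics = {}
    for line in text.splitlines():
        match = re.match(r"(\S+)\s+([0-9.Ee+-]+)$", line.strip())
        if match:
            metrics[match.group(1)] = float(match.group(2))
    return metrics

metrics = parse_metrics(LOG_SNIPPET)
runtime_sec = metrics["dataflow_v2_java11_runtime_sec"]
total_bytes = metrics["dataflow_v2_java11_total_bytes_count"]
print(total_bytes / runtime_sec)  # roughly 1.75 MB/s averaged over the run
```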
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211119124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3622d6a0a35337e17b60c710b6399c61252064c92dbfd6a314f20d7a58007ec5
Deleted: sha256:9eba0e909d2778ca8902010a04e440d732c4e577bb7fee310d52cefbc3371ad1
Deleted: sha256:6184680e7c1bbeab065fef7205c369d6f8267cbf741d86f9a2838b8765eee4e2
Deleted: sha256:9136db020dfe9aabe120d128dcbe9270b9de15c8c032133e3539499c7b9d671d
Deleted: sha256:f438aecdd2b73b67b8ac947210fb96612cc225eddc63763f7b3efad3c17e90ef
Deleted: sha256:f621043c203040adfc270951fa3cdd565def58e9891641539a2e865069453e36
Deleted: sha256:4b217410e9e2e8acebdb0c52161a9c8e74633943b406881df3de116d630a51db
Deleted: sha256:92a500d22d75367bf71c506f8a60944c8f70238da985dffb796231b63cae0bd1
Deleted: sha256:4c67424fae010648a2ac37b1b1ca0d759aced3a4b4f362c8ccb8727078fa0b8b
Deleted: sha256:4e9f9fc565c5882f886e4b0a3740a5c81478c6bbeffa8058f2fcbc06aae2576b
Deleted: sha256:7bbf4f5f1c99fba93120211dd2a26f2bb83390b2707f4210fc3678cd9fc2e81f
Deleted: sha256:01857c97d7cca3fb3cddad10543f58616c56a67b76d2a1a3c060b545bedef08c
Deleted: sha256:46fd7078d43f5abd46766eb9c04448182adbbe467ff222eb7417fabad08f76b3
Deleted: sha256:ae0bfb9f4eea4d5235c5f04ad9ca1b7c5ef2a5959d2b26284a98fad0d28a8209
Deleted: sha256:519bd39f89fa67eeba8446cb6bf4e6155d59c1dfe927cb373a714a756f845524
Deleted: sha256:7fe938f4b906d49e787396893bc6b510bb21e27e9dc9bc5178011a1458bc4e44
Deleted: sha256:c72b5f728f75805db6141ec5162f8c17f1227a7435e37ea71f7afa8312f07504
Deleted: sha256:83b1941d715acb35438eeee1334a682467957c7cf2039a8cca57b346e4b8a4d7
Deleted: sha256:42f51c0fd2822d2ec751ace46ef51c02698afe7144aad020f63f594d6d905afa
Deleted: sha256:fb02fe4285b23eb883cff62066edb6317f1d602b438a1415a19f50665bd2bd24
Deleted: sha256:6bb9732dee216db39e782562334d9d2a27eb9dd86adbcbc2ef5b8296146b6b91
Deleted: sha256:99ebc514fddeb1858e94dec53462cd579f7f31a8916d422a3ee20b962bf5a825
Deleted: sha256:5c3aad92c32eddc2e3fd4355c979d5fcfac93b2fca1197601a11b9150f0b8d59
Deleted: sha256:d3434bc12b697c4662b1336ea3dd4eeef352552642913d9be77ea4fe8422e0ea
Deleted: sha256:7686b81c5d31ff427c422fd250d62add369b440bcb52290304674edef9f1341d
Deleted: sha256:5aaf41ccbc6147b89dfc706c5c74fb02fa90dc5a7903222d8ed34f951dfa48de
Deleted: sha256:33922e9da48d23192f84f423579d621249405f8aebde02286f672ee06be736bc
Deleted: sha256:ff36df0089a9165d4f3b7fc2ea3bedf3eb2174114677fb3a896a30e29fc158a2
Deleted: sha256:d08f1e7e0327fed5fa00d03c6ac5b29d7feb147610b802167d9cbd5c3fa9b447
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211119124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3622d6a0a35337e17b60c710b6399c61252064c92dbfd6a314f20d7a58007ec5]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211119124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3622d6a0a35337e17b60c710b6399c61252064c92dbfd6a314f20d7a58007ec5])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3622d6a0a35337e17b60c710b6399c61252064c92dbfd6a314f20d7a58007ec5
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3622d6a0a35337e17b60c710b6399c61252064c92dbfd6a314f20d7a58007ec5
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3622d6a0a35337e17b60c710b6399c61252064c92dbfd6a314f20d7a58007ec5].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 35s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/i4bgq2wt5g2om

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #154

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/154/display/redirect?page=changes>

Changes:

[samuelw] [BEAM-12818] Write windowed file output to a consistent subdirectory

[samuelw] [BEAM-13268] Change autosharding big query inserts to happen in parallel

[noreply] [BEAM-3293] Add check for error on NewIterable call (#16006)

[Brian Hulette] remove pkg_resources in run_generate_requirements.sh

[Brian Hulette] Run generatePythonRequirementsAll for pyarrow 6

[noreply] Update dev container tags used by Dataflow runner with unreleased SDKs.

[noreply] [BEAM-13265] Add withDeterministicRecordIdFn which allows for

[noreply] [BEAM-13183] [BEAM-8152] Use venv instead of virtualenv to create Python


------------------------------------------
[...truncated 46.38 KB...]
5fb5d1edcd09: Preparing
68dd8de761c9: Preparing
d1c7cb61286c: Preparing
97e67893cc15: Preparing
deeda51a2c9f: Preparing
a9e4c9343539: Preparing
47ee2d19f81a: Preparing
37d28c2bde63: Waiting
ab9d251e27cb: Preparing
8a5844586fdb: Preparing
efff75d61a0b: Waiting
a4aba4e59b40: Preparing
5499f2905579: Preparing
a36ba9e322f7: Preparing
97e67893cc15: Waiting
47ee2d19f81a: Waiting
68dd8de761c9: Waiting
a9e4c9343539: Waiting
deeda51a2c9f: Waiting
ab9d251e27cb: Waiting
d1c7cb61286c: Waiting
5499f2905579: Waiting
a36ba9e322f7: Waiting
9c7007459e3b: Pushed
56b3f2b67eef: Pushed
295cc996b349: Pushed
9d76ec276e57: Pushed
285c92fa1e72: Pushed
37d28c2bde63: Pushed
5fb5d1edcd09: Pushed
68dd8de761c9: Pushed
97e67893cc15: Pushed
a9e4c9343539: Layer already exists
47ee2d19f81a: Layer already exists
ab9d251e27cb: Layer already exists
8a5844586fdb: Layer already exists
efff75d61a0b: Pushed
deeda51a2c9f: Pushed
a4aba4e59b40: Layer already exists
5499f2905579: Layer already exists
a36ba9e322f7: Layer already exists
d1c7cb61286c: Pushed
20211118124333: digest: sha256:f777e18c24a32f8dc9446351c4388a50f2a134fb007548bb8761832699bb991c size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 18, 2021 12:45:33 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 18, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 18, 2021 12:45:34 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 18, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 18, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 18, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 18, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 18, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111716 bytes, hash e016a3666619deb9150c16f372a0c2a7d0d41898f5300488dad7c2c2f58164b6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-4BajZmYZ3rkVDBbzcqDCp9DUGJj1MASI2tfCwvWBZLY.pb
Nov 18, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 18, 2021 12:45:38 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@177515d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@52ff99cd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c2af006, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6]
Nov 18, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 18, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 18, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 18, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 18, 2021 12:45:39 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3206174f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@42fb8c87, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15eb0ae9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65e0b505, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67de7a99]
Nov 18, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 18, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 18, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 18, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 18, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 18, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 18, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 18, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 18, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 18, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 18, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 18, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Nov 18, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-18_04_45_39-10476051663685778151?project=apache-beam-testing
Nov 18, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-18_04_45_39-10476051663685778151
Nov 18, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-18_04_45_39-10476051663685778151
Nov 18, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-18T12:45:46.528Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-wypa. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:55.152Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:55.979Z: Expanding SplittableParDo operations into optimizable parts.
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.016Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.100Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.187Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.231Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.317Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.446Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.490Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.537Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.571Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.610Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.656Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.705Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.730Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.785Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.813Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.856Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.907Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.955Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:56.991Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:57.040Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:57.113Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:57.146Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:57.185Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:57.226Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:57.287Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:57.318Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:57.350Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:57.395Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 18, 2021 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:45:57.878Z: Starting 5 workers in us-central1-a...
Nov 18, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:46:12.418Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 18, 2021 12:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:46:40.907Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 18, 2021 12:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:47:37.313Z: Workers have started successfully.
Nov 18, 2021 12:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T12:47:37.369Z: Workers have started successfully.
Nov 18, 2021 3:54:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-18T15:54:58.410Z: Staged package beam-model-job-management-2.35.0-SNAPSHOT-8QB2g9ssFGaKGT56IbOMrjcQcE9XP-LgiRhvff3RHeg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.35.0-SNAPSHOT-8QB2g9ssFGaKGT56IbOMrjcQcE9XP-LgiRhvff3RHeg.jar' is inaccessible.
Nov 18, 2021 3:54:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-18T15:54:58.694Z: Staged package beam-sdks-java-extensions-arrow-2.35.0-SNAPSHOT--6o0gUFJinkN4upVAkTzRdayVoYWlnjhjcsg2E3us1A.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-arrow-2.35.0-SNAPSHOT--6o0gUFJinkN4upVAkTzRdayVoYWlnjhjcsg2E3us1A.jar' is inaccessible.
Nov 18, 2021 3:55:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-18T15:54:58.904Z: Staged package beam-sdks-java-load-tests-2.35.0-SNAPSHOT-AWXw3244IpmjjLbDXVk6FhB1JxiEeTDrXIDH5IPOx9U.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-load-tests-2.35.0-SNAPSHOT-AWXw3244IpmjjLbDXVk6FhB1JxiEeTDrXIDH5IPOx9U.jar' is inaccessible.
Nov 18, 2021 3:55:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-18T15:55:02.326Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 18, 2021 3:58:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-18T15:58:02.900Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
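[Editor's note: one way to triage the "Staged package ... is inaccessible" errors above is to pull the gs:// URIs out of the saved console log and probe each one with `gsutil stat`. A minimal sketch; the sample line below mirrors the SEVERE messages in this log, but the bucket path in it is illustrative, not taken from the build:]

```shell
# Extract the gs:// URI from a Dataflow "Staged package ... is inaccessible"
# log line. Each extracted path can then be checked with `gsutil stat <uri>`,
# which distinguishes a missing object from a permission error on the
# worker service account.
log_line="SEVERE: 2021-11-18T15:54:58.410Z: Staged package foo.jar at location 'gs://bucket/staging/foo.jar' is inaccessible."
printf '%s\n' "$log_line" | sed -n "s|.*at location '\(gs://[^']*\)'.*|\1|p"
```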
Nov 18, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T16:00:35.930Z: Cancel request is committed for workflow job: 2021-11-18_04_45_39-10476051663685778151.
Nov 18, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T16:00:35.958Z: Cleaning up.
Nov 18, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T16:00:36.032Z: Stopping worker pool...
Nov 18, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T16:00:36.086Z: Stopping worker pool...
Nov 18, 2021 4:02:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T16:02:56.856Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 18, 2021 4:02:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-18T16:02:56.893Z: Worker pool stopped.
Nov 18, 2021 4:03:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-18_04_45_39-10476051663685778151 finished with status CANCELLED.
Load test results for test (ID): 28fa5b47-7660-4b8f-8ef1-056619b8a707 and timestamp: 2021-11-18T12:45:33.790000000Z:
Metric:                                 Value:
dataflow_v2_java11_runtime_sec          11547.62
dataflow_v2_java11_total_bytes_count    2.07161488E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211118124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f777e18c24a32f8dc9446351c4388a50f2a134fb007548bb8761832699bb991c
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211118124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f777e18c24a32f8dc9446351c4388a50f2a134fb007548bb8761832699bb991c]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211118124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f777e18c24a32f8dc9446351c4388a50f2a134fb007548bb8761832699bb991c])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f777e18c24a32f8dc9446351c4388a50f2a134fb007548bb8761832699bb991c
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f777e18c24a32f8dc9446351c4388a50f2a134fb007548bb8761832699bb991c
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f777e18c24a32f8dc9446351c4388a50f2a134fb007548bb8761832699bb991c].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 46s
101 actionable tasks: 73 executed, 26 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4fzlmo7wxjdtu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #153

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/153/display/redirect?page=changes>

Changes:

[aydar.zaynutdinov] [BEAM-13178][Playground]

[Pablo Estrada] [BEAM-8033] Throwing an error if a dataSourceProviderFn is defined twice

[minherz] fix: update dlp dependency version range

[noreply] [BEAM-3293] Finish E2E implementation of MultiMap side inputs, add

[noreply] Merge pull request #15740 from [BEAM-12936][Playground] Code editor -

[noreply] Merge pull request #15976 from [Playground][BEAM-12941][Bugfix] Fix

[noreply] Merge pull request #15991 [BEAM-13251][Playground] [Bugfix] Lint Fails

[noreply] Merge pull request #15990 from [BEAM-13177][Playground] Change using of

[noreply] [BEAM-13262] Forward for metrics.SingleResult (#15993)

[noreply] Generate Python container dependencies in an automated way. (#15927)

[noreply] [BEAM-13264] Allow pyarrow up to 6.x (#15995)

[noreply] Update the go get command with v2 (#15945)

[noreply] Go Quickstart: switch to Spark 3 JobServer for better out-of-box

[noreply] Golint fixes for recent Go SDK import (#15999)

[noreply] Merge pull request #15997: [BEAM-8688] Upgrading gcsio library to latest


------------------------------------------
[...truncated 47.50 KB...]
7644e3242d71: Preparing
030d38dc9a3a: Preparing
7d9c85bf174e: Preparing
f3f68f0c3ef9: Preparing
9e10b8ced0e8: Preparing
e2c2d94ac4e7: Preparing
796fe27cf4f1: Preparing
5256a55da86b: Preparing
59c563b0823b: Preparing
42a75344940e: Preparing
5cf507d14524: Preparing
34d0a025f540: Preparing
a9e4c9343539: Preparing
47ee2d19f81a: Preparing
ab9d251e27cb: Preparing
8a5844586fdb: Preparing
a4aba4e59b40: Preparing
5499f2905579: Preparing
a36ba9e322f7: Preparing
59c563b0823b: Waiting
42a75344940e: Waiting
5cf507d14524: Waiting
a4aba4e59b40: Waiting
8a5844586fdb: Waiting
34d0a025f540: Waiting
5499f2905579: Waiting
e2c2d94ac4e7: Waiting
796fe27cf4f1: Waiting
a9e4c9343539: Waiting
a36ba9e322f7: Waiting
47ee2d19f81a: Waiting
5256a55da86b: Waiting
ab9d251e27cb: Waiting
9e10b8ced0e8: Pushed
030d38dc9a3a: Pushed
7d9c85bf174e: Pushed
f3f68f0c3ef9: Pushed
e2c2d94ac4e7: Pushed
5256a55da86b: Pushed
7644e3242d71: Pushed
59c563b0823b: Pushed
a9e4c9343539: Layer already exists
47ee2d19f81a: Layer already exists
ab9d251e27cb: Layer already exists
8a5844586fdb: Layer already exists
796fe27cf4f1: Pushed
a4aba4e59b40: Layer already exists
5499f2905579: Layer already exists
a36ba9e322f7: Layer already exists
5cf507d14524: Pushed
34d0a025f540: Pushed
42a75344940e: Pushed
20211117124333: digest: sha256:09aa9ff628f29be3fba2566fb2621496a95b5101f95c9256289a61bc609488db size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 17, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 17, 2021 12:45:30 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 17, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 17, 2021 12:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 17, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 17, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 17, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111716 bytes, hash cdf42de559fdb8dde31baddcb183f8d83de2ba73962ee3a8ceb6ca3d2846e1cb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-zfQt5Vn9uN3jG63csYP42D3iunOWLuOozrbKPShG4cs.pb
Nov 17, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 17, 2021 12:45:35 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@52ff99cd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c2af006, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44032fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3af7d855]
Nov 17, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 17, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 17, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 17, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 17, 2021 12:45:35 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b34832b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48f4713c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f1868c9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088]
Nov 17, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 17, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 17, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 17, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 17, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 17, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 17, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 17, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 17, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 17, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 17, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 17, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Nov 17, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-17_04_45_35-8958723133952108707?project=apache-beam-testing
Nov 17, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-17_04_45_35-8958723133952108707
Nov 17, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-17_04_45_35-8958723133952108707
Nov 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-17T12:45:42.148Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-26at. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:46.754Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:47.374Z: Expanding SplittableParDo operations into optimizable parts.
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:47.427Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:47.497Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:47.569Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:47.604Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:47.685Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:47.796Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:47.837Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:47.878Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:47.907Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:47.936Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:47.997Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:48.065Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:48.101Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:48.131Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:48.178Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:48.212Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:48.251Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:48.286Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:48.320Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:48.358Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:48.389Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:48.424Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:48.460Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:48.488Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:48.521Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:48.558Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:48.602Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 17, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:48.641Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 17, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:45:48.989Z: Starting 5 workers in us-central1-a...
Nov 17, 2021 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:46:07.330Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 17, 2021 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:46:32.928Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 17, 2021 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:47:26.593Z: Workers have started successfully.
Nov 17, 2021 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T12:47:26.626Z: Workers have started successfully.
Nov 17, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T16:00:24.862Z: Cancel request is committed for workflow job: 2021-11-17_04_45_35-8958723133952108707.
Nov 17, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T16:00:25.174Z: Cleaning up.
Nov 17, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T16:00:25.265Z: Stopping worker pool...
Nov 17, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T16:00:25.313Z: Stopping worker pool...
Nov 17, 2021 4:02:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T16:02:54.477Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 17, 2021 4:02:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-17T16:02:54.518Z: Worker pool stopped.
Nov 17, 2021 4:03:01 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-17_04_45_35-8958723133952108707 finished with status CANCELLED.
Load test results for test (ID): d4d2d075-5db1-4fe5-88fa-45fb70c09531 and timestamp: 2021-11-17T12:45:30.544000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11544.927
dataflow_v2_java11_total_bytes_count             2.24061703E10
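As a quick sanity check of the figures reported above (both values copied from this log), dividing the total byte count by the runtime gives a sustained throughput of roughly 1.9 MB/s:

```python
# Values reported by the load test above.
runtime_sec = 11544.927      # dataflow_v2_java11_runtime_sec
total_bytes = 2.24061703e10  # dataflow_v2_java11_total_bytes_count

throughput_bytes_per_sec = total_bytes / runtime_sec
print(f"{throughput_bytes_per_sec / 1e6:.2f} MB/s")  # prints "1.94 MB/s"
```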
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
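The timestamps in this log show the job was submitted at 12:45:35 UTC and the cancel request committed at 16:00:24 UTC, after which JobFailure.handleFailure treats the CANCELLED terminal state as a build failure. The elapsed wall-clock time (timestamps copied from the log above) can be checked with a short calculation:

```python
from datetime import datetime

# Timestamps copied from the log above (UTC).
submitted = datetime.fromisoformat("2021-11-17T12:45:35")
cancelled = datetime.fromisoformat("2021-11-17T16:00:24")

elapsed = cancelled - submitted
print(elapsed)  # prints "3:14:49"
```

This is consistent with the reported runtime_sec metric of 11544.927 seconds, the difference being startup and cancellation overhead.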

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211117124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:09aa9ff628f29be3fba2566fb2621496a95b5101f95c9256289a61bc609488db
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211117124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:09aa9ff628f29be3fba2566fb2621496a95b5101f95c9256289a61bc609488db]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211117124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:09aa9ff628f29be3fba2566fb2621496a95b5101f95c9256289a61bc609488db])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:09aa9ff628f29be3fba2566fb2621496a95b5101f95c9256289a61bc609488db
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:09aa9ff628f29be3fba2566fb2621496a95b5101f95c9256289a61bc609488db
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:09aa9ff628f29be3fba2566fb2621496a95b5101f95c9256289a61bc609488db].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 45s
101 actionable tasks: 71 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wdpql4z5cgxcc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #152

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/152/display/redirect?page=changes>

Changes:

[brachipa] [BEAM-12737] error handling for sql calc error

[brachipa] [BEAM-12737] error handling for sql calc error

[brachipa] [BEAM-12737] error handling for sql calc error

[brachipa] [BEAM-12737] add test

[brachipa] [BEAM-12737] fix build

[brachipa] [BEAM-12737] fix build

[brachipa] revert

[brachipa] [beam-12737] cr fix

[brachipa] [beam-12737] cr fixes

[brachipa] [beam-12737] send errorTransformer to `buildPTransform`

[brachipa] [beam-12737] send errorTransformer to `buildPTransform`

[brachipa] [beam-12737] fix type `? extends POutput`

[Robert Bradshaw] Fully resolve type variables after PTransform type inference.

[brachipa] [beam-12737] add complex test, change error row type

[noreply] [BEAM-13226] Batch metrics parsing errors (#15972)

[noreply] [BEAM-13239] pass through io.EOF when decoding row (#15971)

[noreply] [BEAM-5378] Register types & funcs for examples. (#15970)

[Robert Bradshaw] Key-inferable type hints for CombinePerKey.

[Robert Bradshaw] Implement type variable binding for union types.

[Robert Bradshaw] Add comment about wildcard.

[noreply] [BEAM-3293] Add initial MultiMap side input backing type (#15975)

[Robert Bradshaw] Type inference deterministic coder fix for old Python versions.

[Robert Bradshaw] Also implement type matching for union types.

[noreply] [BEAM-13238] Bump all Dataflow Java VR timeouts (#15979)

[noreply] Merge pull request #15978: [BEAM-8691] Upgrading Java deps compatible

[noreply] [BEAM-13221] Don't drop last data. (#15939)


------------------------------------------
[...truncated 48.15 KB...]

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
a19b100d9727: Preparing
3e7bbb9a0ae1: Preparing
c3e81b40e5a4: Preparing
e416520b2a16: Preparing
47d80867c3f6: Preparing
a983ffe86b29: Preparing
06b1822f9592: Preparing
b639d92bb7ed: Preparing
537a22569349: Preparing
ef3341e27015: Preparing
fa1a9ea6ce41: Preparing
839c0169a061: Preparing
78700b6b35d0: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
62a747bf1719: Preparing
a983ffe86b29: Waiting
ba6e5ff31f23: Waiting
9f9f651e9303: Waiting
537a22569349: Waiting
fa1a9ea6ce41: Waiting
78700b6b35d0: Waiting
62a5b8741e83: Waiting
b639d92bb7ed: Waiting
0b3c02b5d746: Waiting
36e0782f1159: Waiting
839c0169a061: Waiting
06b1822f9592: Waiting
47d80867c3f6: Pushed
c3e81b40e5a4: Pushed
3e7bbb9a0ae1: Pushed
a19b100d9727: Pushed
a983ffe86b29: Pushed
e416520b2a16: Pushed
b639d92bb7ed: Pushed
537a22569349: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
fa1a9ea6ce41: Pushed
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
9f9f651e9303: Layer already exists
0b3c02b5d746: Layer already exists
839c0169a061: Pushed
62a747bf1719: Layer already exists
06b1822f9592: Pushed
ef3341e27015: Pushed
20211116124334: digest: sha256:fbb71c2337c295639bb3f17335ecb1a4c1e08f8a7161cc8227a859dd0841ee67 size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 16, 2021 12:45:28 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 16, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 16, 2021 12:45:29 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 16, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 16, 2021 12:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 16, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 16, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 16, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111704 bytes, hash 5add93afc2ae7a0b7ef5a3a1540b4c011a754a5fb4b6dde69bc4cfa0f1cb5f15> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Wt2Tr8Kuegt-9aOhVAtMARp1Sl-0tt3mm8TPoPHLXxU.pb
Nov 16, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 16, 2021 12:45:34 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3af7d855, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77049094, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f88bfbe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59bbe88a]
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 16, 2021 12:45:35 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3206174f]
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-16_04_45_35-14900297558574420371?project=apache-beam-testing
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-16_04_45_35-14900297558574420371
Nov 16, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-16_04_45_35-14900297558574420371
Nov 16, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-16T12:45:42.692Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-o50r. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 16, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:49.003Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 16, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:49.613Z: Expanding SplittableParDo operations into optimizable parts.
Nov 16, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:49.663Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 16, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:49.732Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 16, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:49.807Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 16, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:49.886Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 16, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:49.950Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 16, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.073Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.107Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.144Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.189Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.224Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.282Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.316Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.363Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.413Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.468Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.492Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.527Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.570Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.635Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.669Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.700Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.738Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.777Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.827Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.849Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.885Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.919Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:50.971Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 16, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:45:51.475Z: Starting 5 workers in us-central1-a...
Nov 16, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:46:22.130Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 16, 2021 12:46:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:46:41.609Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 16, 2021 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:47:38.691Z: Workers have started successfully.
Nov 16, 2021 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T12:47:38.739Z: Workers have started successfully.
Nov 16, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T16:00:25.549Z: Cancel request is committed for workflow job: 2021-11-16_04_45_35-14900297558574420371.
Nov 16, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T16:00:25.722Z: Cleaning up.
Nov 16, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T16:00:25.862Z: Stopping worker pool...
Nov 16, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T16:00:25.945Z: Stopping worker pool...
Nov 16, 2021 4:02:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T16:02:53.819Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 16, 2021 4:02:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-16T16:02:53.854Z: Worker pool stopped.
Nov 16, 2021 4:03:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-16_04_45_35-14900297558574420371 finished with status CANCELLED.
Load test results for test (ID): 42417e3b-7856-41e9-982f-748d232ac7ce and timestamp: 2021-11-16T12:45:29.471000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11540.437
dataflow_v2_java11_total_bytes_count             2.22921607E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
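[Editor's sketch] The stack trace above shows the load-test harness rejecting a terminal state other than DONE: the scheduled cancel leaves the streaming job CANCELLED, the harness throws, the JVM exits nonzero, and the Gradle task fails. A minimal sketch of that pattern, with illustrative names (not Beam's actual JobFailure API):

```java
// Simplified sketch: a job that finishes in a non-DONE terminal state is
// surfaced as a RuntimeException, which makes the process exit nonzero and
// the invoking Gradle task fail. All names here are illustrative.
class TerminalStateCheck {
    enum State { DONE, CANCELLED, FAILED }

    static void handleTerminalState(State state) {
        if (state != State.DONE) {
            throw new RuntimeException("Invalid job state: " + state + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleTerminalState(State.CANCELLED);
        } catch (RuntimeException e) {
            // Matches the message in the log above.
            System.out.println(e.getMessage());
        }
    }
}
```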

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211116124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:fbb71c2337c295639bb3f17335ecb1a4c1e08f8a7161cc8227a859dd0841ee67
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211116124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:fbb71c2337c295639bb3f17335ecb1a4c1e08f8a7161cc8227a859dd0841ee67]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211116124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:fbb71c2337c295639bb3f17335ecb1a4c1e08f8a7161cc8227a859dd0841ee67])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:fbb71c2337c295639bb3f17335ecb1a4c1e08f8a7161cc8227a859dd0841ee67
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:fbb71c2337c295639bb3f17335ecb1a4c1e08f8a7161cc8227a859dd0841ee67
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:fbb71c2337c295639bb3f17335ecb1a4c1e08f8a7161cc8227a859dd0841ee67].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 45s
101 actionable tasks: 72 executed, 27 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dqjkko7wblfz4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #151

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/151/display/redirect>

Changes:


------------------------------------------
[...truncated 48.84 KB...]
13166cc94378: Preparing
bc6c3f04f977: Preparing
eddac7d9be2a: Preparing
f93a0f611dd1: Preparing
2603b8f84eb7: Preparing
58383ef9b4e7: Preparing
622f37308080: Preparing
550a9a3f0f1b: Preparing
15a758d156a7: Preparing
e8434966d4f0: Preparing
cf265fc91198: Preparing
cc0b6de21ba4: Preparing
78700b6b35d0: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
58383ef9b4e7: Waiting
62a747bf1719: Preparing
622f37308080: Waiting
550a9a3f0f1b: Waiting
15a758d156a7: Waiting
0b3c02b5d746: Waiting
36e0782f1159: Waiting
cc0b6de21ba4: Waiting
cf265fc91198: Waiting
62a747bf1719: Waiting
78700b6b35d0: Waiting
ba6e5ff31f23: Waiting
e8434966d4f0: Waiting
62a5b8741e83: Waiting
9f9f651e9303: Waiting
eddac7d9be2a: Pushed
2603b8f84eb7: Pushed
bc6c3f04f977: Pushed
58383ef9b4e7: Pushed
13166cc94378: Pushed
550a9a3f0f1b: Pushed
f93a0f611dd1: Pushed
15a758d156a7: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
9f9f651e9303: Layer already exists
cf265fc91198: Pushed
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
622f37308080: Pushed
cc0b6de21ba4: Pushed
e8434966d4f0: Pushed
20211115124331: digest: sha256:d03c90db7a93e49c3766f1eed1b5bb9c36ddd7e4e17a4976f87418cf3475382c size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 15, 2021 12:45:27 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 15, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 15, 2021 12:45:29 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 15, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 15, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 15, 2021 12:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 15, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 15, 2021 12:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111704 bytes, hash bd4a691a6f84633dfd221c889347d95e44f57c39bd7250ee525fe5b07788ee16> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-vUppGm-EYz39IhyIk0fZXkT1fDm9clDuUl_lsHeI7hY.pb
Nov 15, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 15, 2021 12:45:34 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3af7d855, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77049094, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f88bfbe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59bbe88a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d8ab698]
Nov 15, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 15, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 15, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 15, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 15, 2021 12:45:34 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3206174f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@42fb8c87]
Nov 15, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 15, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 15, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 15, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 15, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 15, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 15, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 15, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 15, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 15, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 15, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 15, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Nov 15, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-15_04_45_34-10727282003328944491?project=apache-beam-testing
Nov 15, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-15_04_45_34-10727282003328944491
Nov 15, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-15_04_45_34-10727282003328944491
Nov 15, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-15T12:45:41.108Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-qxax. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:45.543Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:46.328Z: Expanding SplittableParDo operations into optimizable parts.
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:46.353Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:46.438Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:46.527Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:46.552Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:46.621Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:46.717Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:46.762Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:46.797Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:46.826Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:46.844Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:46.867Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:46.894Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:46.930Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:46.961Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:46.990Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:47.022Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:47.056Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:47.092Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:47.129Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:47.162Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:47.195Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:47.227Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:47.266Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:47.299Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:47.331Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:47.359Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:47.390Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:47.434Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 15, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:47.925Z: Starting 5 workers in us-central1-a...
Nov 15, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:45:54.146Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 15, 2021 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:46:28.754Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 15, 2021 12:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:47:21.749Z: Workers have started successfully.
Nov 15, 2021 12:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T12:47:21.794Z: Workers have started successfully.
Nov 15, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T16:00:29.029Z: Cancel request is committed for workflow job: 2021-11-15_04_45_34-10727282003328944491.
Nov 15, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T16:00:29.100Z: Cleaning up.
Nov 15, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T16:00:29.194Z: Stopping worker pool...
Nov 15, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T16:00:29.235Z: Stopping worker pool...
Nov 15, 2021 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T16:02:57.344Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 15, 2021 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-15T16:02:57.378Z: Worker pool stopped.
Nov 15, 2021 4:03:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-15_04_45_34-10727282003328944491 finished with status CANCELLED.
Load test results for test (ID): c3c0ac09-c8f0-4933-9972-5638b16ef9ab and timestamp: 2021-11-15T12:45:28.678000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11478.655
dataflow_v2_java11_total_bytes_count               3.8981957E9
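[Editor's sketch] The two metrics above can be combined into an average throughput figure for the run. A minimal sketch of that arithmetic (the helper name is illustrative, not part of the load-test harness):

```java
// Illustrative helper: average throughput derived from the reported
// load-test metrics (total bytes processed over runtime in seconds).
class ThroughputCheck {
    /** totalBytes processed over runtimeSec, expressed in MiB/s. */
    static double throughputMiBps(double totalBytes, double runtimeSec) {
        return totalBytes / runtimeSec / (1024.0 * 1024.0);
    }

    public static void main(String[] args) {
        // Values reported above: total_bytes_count / runtime_sec.
        double mibps = throughputMiBps(3.8981957e9, 11478.655);
        System.out.printf("average throughput: %.2f MiB/s%n", mibps); // ~0.32 MiB/s
    }
}
```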
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211115124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d03c90db7a93e49c3766f1eed1b5bb9c36ddd7e4e17a4976f87418cf3475382c
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211115124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d03c90db7a93e49c3766f1eed1b5bb9c36ddd7e4e17a4976f87418cf3475382c]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211115124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d03c90db7a93e49c3766f1eed1b5bb9c36ddd7e4e17a4976f87418cf3475382c])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d03c90db7a93e49c3766f1eed1b5bb9c36ddd7e4e17a4976f87418cf3475382c
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d03c90db7a93e49c3766f1eed1b5bb9c36ddd7e4e17a4976f87418cf3475382c
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d03c90db7a93e49c3766f1eed1b5bb9c36ddd7e4e17a4976f87418cf3475382c].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 49s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wpusiuywl442u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #150

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/150/display/redirect>

Changes:


------------------------------------------
[...truncated 48.26 KB...]
f10c84e98196: Preparing
647a916d6752: Preparing
ab4be44fdb6c: Preparing
c6b7c0b24692: Preparing
0ec397045672: Preparing
e9bad2710870: Preparing
f152ea72d34f: Preparing
7d73bbcb2f7f: Preparing
c511f8453bc6: Preparing
2a974265d5bc: Preparing
96bc97fcb9ff: Preparing
f84924b774ce: Preparing
78700b6b35d0: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
e9bad2710870: Waiting
0b3c02b5d746: Preparing
62a747bf1719: Preparing
f152ea72d34f: Waiting
c511f8453bc6: Waiting
7d73bbcb2f7f: Waiting
36e0782f1159: Waiting
78700b6b35d0: Waiting
0b3c02b5d746: Waiting
2a974265d5bc: Waiting
62a5b8741e83: Waiting
9f9f651e9303: Waiting
ba6e5ff31f23: Waiting
62a747bf1719: Waiting
96bc97fcb9ff: Waiting
f84924b774ce: Waiting
ab4be44fdb6c: Pushed
647a916d6752: Pushed
0ec397045672: Pushed
c6b7c0b24692: Pushed
f10c84e98196: Pushed
e9bad2710870: Pushed
7d73bbcb2f7f: Pushed
c511f8453bc6: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
96bc97fcb9ff: Pushed
9f9f651e9303: Layer already exists
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
f152ea72d34f: Pushed
f84924b774ce: Pushed
2a974265d5bc: Pushed
20211114124333: digest: sha256:43762bcbecb5e049d54e9a08b7b25073a5144d6eef40c8aab0a235cff2c924de size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 14, 2021 12:45:30 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 14, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 14, 2021 12:45:31 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 14, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 14, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 14, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 14, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 14, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111704 bytes, hash df40614a24002a92873d6f51887c5033d985c7c09fefa065dc851745151d8449> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-30BhSiQAKpKHPW9RiHxQM9mFx8Cf76Bl3IUXRRUdhEk.pb
Nov 14, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 14, 2021 12:45:36 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3af7d855, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77049094, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f88bfbe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59bbe88a]
Nov 14, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 14, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 14, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 14, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 14, 2021 12:45:36 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3206174f]
Nov 14, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 14, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 14, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 14, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 14, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 14, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 14, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 14, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 14, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 14, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 14, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 14, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Nov 14, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-14_04_45_36-5683192734576139481?project=apache-beam-testing
Nov 14, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-14_04_45_36-5683192734576139481
Nov 14, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-14_04_45_36-5683192734576139481
Nov 14, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-14T12:45:47.524Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-o2oq. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 14, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:51.821Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:52.768Z: Expanding SplittableParDo operations into optimizable parts.
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:52.874Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:52.947Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.018Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.041Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.097Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.193Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.234Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.262Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.289Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.311Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.342Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.365Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.387Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.420Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.452Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.492Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.520Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.578Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.603Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.627Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.663Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.688Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.742Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.777Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.800Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.836Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.861Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:53.896Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 14, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:54.229Z: Starting 5 workers in us-central1-a...
Nov 14, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:45:58.201Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 14, 2021 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:46:34.488Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 14, 2021 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:47:33.907Z: Workers have started successfully.
Nov 14, 2021 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T12:47:33.940Z: Workers have started successfully.
Nov 14, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T16:00:26.915Z: Cancel request is committed for workflow job: 2021-11-14_04_45_36-5683192734576139481.
Nov 14, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T16:00:27.033Z: Cleaning up.
Nov 14, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T16:00:27.101Z: Stopping worker pool...
Nov 14, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T16:00:27.181Z: Stopping worker pool...
Nov 14, 2021 4:02:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T16:02:45.651Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 14, 2021 4:02:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-14T16:02:45.681Z: Worker pool stopped.
Nov 14, 2021 4:02:51 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-14_04_45_36-5683192734576139481 finished with status CANCELLED.
Load test results for test (ID): 5ffcbc17-ff6d-44f3-8604-517e0ea17320 and timestamp: 2021-11-14T12:45:30.910000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11489.554
dataflow_v2_java11_total_bytes_count             1.74862132E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211114124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:43762bcbecb5e049d54e9a08b7b25073a5144d6eef40c8aab0a235cff2c924de
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211114124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:43762bcbecb5e049d54e9a08b7b25073a5144d6eef40c8aab0a235cff2c924de]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211114124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:43762bcbecb5e049d54e9a08b7b25073a5144d6eef40c8aab0a235cff2c924de])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:43762bcbecb5e049d54e9a08b7b25073a5144d6eef40c8aab0a235cff2c924de
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:43762bcbecb5e049d54e9a08b7b25073a5144d6eef40c8aab0a235cff2c924de
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:43762bcbecb5e049d54e9a08b7b25073a5144d6eef40c8aab0a235cff2c924de].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 34s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/se24kacyytz6q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #149

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/149/display/redirect?page=changes>

Changes:

[mmack] [BEAM-13009] Bugfix: Prevent dataloss in DynamoDBIO write if

[mmack] [BEAM-13009] Bugfix: Prevent dataloss in DynamoDBIO write if

[aydar.zaynutdinov] [BEAM-13224][Playground]

[Alexey Romanenko] [BEAM-4868] Bump com.amazonaws to 1.12.106

[noreply] [BEAM-13147] Fix nullability issues in aws (SDK v1 + v2) for AwsModule

[noreply] [BEAM-3293] Add encoding of map/multimap side inputs into proto (#15960)

[noreply] [BEAM-13052] Implement ProtoPlusCoder and add it to the default options

[noreply] [BEAM-13193] Add elements field in ProcessBundleRequest and ProcessBu…

[noreply] [BEAM-3304] Refactor trigger API (#15952)

[noreply] [BEAM-3293] Add NewKeyedIterable function to SideInputAdapter interface

[noreply] [BEAM-13223] Documents two new features of cross-language transforms


------------------------------------------
[...truncated 48.94 KB...]
36e0782f1159: Waiting
4efe358eb92d: Waiting
ba6e5ff31f23: Waiting
9f9f651e9303: Waiting
548ebbd63351: Waiting
5aa55cdcd3b7: Pushed
a18f9f86ccf0: Pushed
d51569dbd862: Pushed
337e6e031976: Pushed
5f0c26ac26b3: Pushed
548ebbd63351: Pushed
5ceb4016e1e8: Pushed
4efe358eb92d: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
45a48cd22bd3: Pushed
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
9f9f651e9303: Layer already exists
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
2b46f4037d09: Pushed
f93b95d11f1c: Pushed
d85eeaf7a18a: Pushed
20211113124333: digest: sha256:bcabd5c18e1479a4df46a226f2b252d16f9ed630894ade61aced83853f334467 size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 13, 2021 12:45:21 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 13, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 13, 2021 12:45:23 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 13, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 13, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111704 bytes, hash 2967e31a4fd7e7a80b2376cdaad7e25d14e9ea9929b597ef3e988646365190bc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-KWfjGk_X56gLI3bNqtfiXRTp6pkptZfvPpiGRjZRkLw.pb
Nov 13, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 13, 2021 12:45:27 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3af7d855, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77049094, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f88bfbe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59bbe88a]
Nov 13, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 13, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 13, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 13, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 13, 2021 12:45:27 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3206174f]
Nov 13, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 13, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 13, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 13, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 13, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 13, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 13, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 13, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 13, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 13, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 13, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 13, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Nov 13, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-13_04_45_27-7894454680376627106?project=apache-beam-testing
Nov 13, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-13_04_45_27-7894454680376627106
Nov 13, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-13_04_45_27-7894454680376627106
Nov 13, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-13T12:45:34.552Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-8en0. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 13, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:38.522Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 13, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.066Z: Expanding SplittableParDo operations into optimizable parts.
Nov 13, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.127Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 13, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.193Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 13, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.265Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 13, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.293Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 13, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.355Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 13, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.443Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 13, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.471Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 13, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.499Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.562Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.585Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.611Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.644Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.677Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.707Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.744Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.776Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.809Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.840Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.875Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.909Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.955Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:39.982Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:40.003Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:40.038Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:40.070Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:40.129Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:40.159Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:40.185Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:45:40.512Z: Starting 5 workers in us-central1-a...
Nov 13, 2021 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:46:11.686Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 13, 2021 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:46:25.510Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 13, 2021 12:47:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:47:21.345Z: Workers have started successfully.
Nov 13, 2021 12:47:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T12:47:21.366Z: Workers have started successfully.
Nov 13, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T16:00:26.555Z: Cancel request is committed for workflow job: 2021-11-13_04_45_27-7894454680376627106.
Nov 13, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T16:00:26.610Z: Cleaning up.
Nov 13, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T16:00:26.706Z: Stopping worker pool...
Nov 13, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T16:00:26.745Z: Stopping worker pool...
Nov 13, 2021 4:02:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T16:02:45.063Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 13, 2021 4:02:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-13T16:02:45.106Z: Worker pool stopped.
Nov 13, 2021 4:02:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-13_04_45_27-7894454680376627106 finished with status CANCELLED.
Load test results for test (ID): 19cd8509-1910-4f6d-832d-5f0f7276db5c and timestamp: 2021-11-13T12:45:22.673000000Z:
Metric:                                  Value:
dataflow_v2_java11_runtime_sec           11552.236
dataflow_v2_java11_total_bytes_count     1.95279897E10
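For reference, the two metrics above can be combined into an average throughput figure (total bytes divided by runtime). The helper below is a hypothetical illustration of that arithmetic, not part of the Beam load-test harness:

```java
// Hypothetical helper: derives average throughput from the metrics
// reported by this run (runtime in seconds, total bytes processed).
public class Throughput {
    static double bytesPerSecond(double totalBytes, double runtimeSec) {
        return totalBytes / runtimeSec;
    }

    public static void main(String[] args) {
        double runtimeSec = 11552.236;      // dataflow_v2_java11_runtime_sec
        double totalBytes = 1.95279897e10;  // dataflow_v2_java11_total_bytes_count
        // Roughly 1.69 MB/s for this cancelled run.
        System.out.printf("avg throughput: %.2f MB/s%n",
            bytesPerSecond(totalBytes, runtimeSec) / 1e6);
    }
}
```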
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
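The RuntimeException above is the load-test harness deliberately failing the build: per the stack trace, `JobFailure.handleFailure` rejects a terminal state of CANCELLED rather than DONE. A minimal sketch of that kind of check (hypothetical helper name, not the actual Beam code):

```java
// Illustrative sketch only: mirrors the idea behind
// org.apache.beam.sdk.loadtests.JobFailure, which raises
// "Invalid job state" for terminal states other than DONE.
// The helper below is hypothetical and not part of the Beam SDK.
public class TerminalStateCheck {
    static boolean isSuccessfulTerminalState(String state) {
        // Only DONE counts as success; CANCELLED and FAILED do not.
        return "DONE".equals(state);
    }

    public static void main(String[] args) {
        String state = "CANCELLED"; // state reported in the log above
        System.out.println("state=" + state
            + " success=" + isSuccessfulTerminalState(state));
    }
}
```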

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211113124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bcabd5c18e1479a4df46a226f2b252d16f9ed630894ade61aced83853f334467
Deleted: sha256:42201945660da287e915df98c55289f326627a04547468a0af76161ee7670678
Deleted: sha256:710a039a07511c2feb76251d921c58051ca7e21bf291247993fb7d4a1f393677
Deleted: sha256:b2858822efffb12059a9df456e0dec1f2441cdfb4a6ea1912042188e95d6751b
Deleted: sha256:34c907a17ab4b61b18cd76146136e389ba7aafafd590c068cc92c4622744e121
Deleted: sha256:694c40bd299d70813dbdfb275da9927e239be7150004dc1b6a79a9fd9d290b88
Deleted: sha256:1e74f10f4665b395dfccc6caeb5c1110faf7ea49d4f18a88d0903e151bf6899f
Deleted: sha256:6e2ee22f59e3e4f1bd3274328beb01792deb409292cb7c7b77da5892460fce6c
Deleted: sha256:9b907e93c92256207b20ad7e12b53bc3a105e3952538e6f041b25d81ed3886d0
Deleted: sha256:c0eac6df4091c4d900d1277418f0bfaacf0aab32b9b959f0abd56f49e8da1231
Deleted: sha256:ec839a739e511698749dfe1544b0ea5714dfe33f88e1cc3f673d56bf2f3fd6c9
Deleted: sha256:55dfe866f54c28378b04c355d28d201d8dca52e383f4ca6114af22a138c1652a
Deleted: sha256:0828d2e1fe4ec84bdaa0acefb61e879ca7a562277d20fadcdccebfb33af6bab6
Deleted: sha256:e6d178f5ba7b42ab94e0c6258bc340db5ce354df74682e0a72d6c06df23df137
Deleted: sha256:9eb135533d44d58391215a8b1875ba90d7d0b599f5a577b1ee7e4768b3b60a40
Deleted: sha256:58233ef39756774fd183667b75279f6ffc8266397607f016bbfad06d954076c2
Deleted: sha256:dd016ece6bb68c5091a47effbdeb94bfdac8d9816657d79532d22853252a1826
Deleted: sha256:e75b2ae5a24e4e3489969eee381558c028bce5f71f85a44eb5d58f0ebf133b72
Deleted: sha256:7797b5a8932a65bb997dc382eb1ae027e59a6698798a2f1eca0e6a84421cfc6e
Deleted: sha256:fd3e42c53163499cf6fa616bf04f45d1816b4caf3f79b8706ffc9a2cc683f094
Deleted: sha256:1a94b04dfaa07fb5cb681b08707c40f3281f791051686f7399db25f00ed91b78
Deleted: sha256:9a0cc664e37a3979c40cabb23f8d53a877f576503e7abe0c99346c32e3355d9c
Deleted: sha256:eb6095aad2e18da16b589c46e0e909c7ab8ed3b16442d6f7efd4079b877b628f
Deleted: sha256:eb7fcfc703c62e482882d32855789a0f69f7155c2f971a9060401e2e9bb02ed3
Deleted: sha256:50d6d305c87a3a4e2899f696a92980b41b68c43614eecbc2430db39d4c825f8a
Deleted: sha256:452df4e79eed2d3fbbbf153b8a741c47af2fd1d6f7ef7f3259160cd2e32def36
Deleted: sha256:8ad51c25810959682656f926083ca543794a8bac715c001818bf890ba369e389
Deleted: sha256:49c144cd09ca614ce300f151b5d81ba4eaab34eddebb736987224602fdbafc22
Deleted: sha256:06fc00fb81fa24768626a757697c07858452ab2ec94d35136c88b3fbd5208d46
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211113124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bcabd5c18e1479a4df46a226f2b252d16f9ed630894ade61aced83853f334467]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211113124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bcabd5c18e1479a4df46a226f2b252d16f9ed630894ade61aced83853f334467])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bcabd5c18e1479a4df46a226f2b252d16f9ed630894ade61aced83853f334467
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bcabd5c18e1479a4df46a226f2b252d16f9ed630894ade61aced83853f334467
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bcabd5c18e1479a4df46a226f2b252d16f9ed630894ade61aced83853f334467].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 34s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5275qyqmey4bo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #148

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/148/display/redirect?page=changes>

Changes:

[Kyle Weaver] Update Beam website to release 2.34.0.

[Alexey Romanenko] [BEAM-8727] Bump software.amazon.awssdk to 2.17.77

[noreply] [BEAM-12550] Parallelizable kurtosis Implementation (#15909)

[noreply] [BEAM-13016] Remove avro-python3 dependency from Beam (#15900)

[Kyle Weaver] Prepare docs for 2.34.0 release.

[noreply] [BEAM-13133] Loosen partitioning requirement for sample (#15818)

[noreply] [BEAM-13228] fix data race in metrics.store (#15946)

[noreply] [BEAM-3293] Add binding cases for MultiMap side inputs (#15943)

[noreply] [BEAM-13222] Re-enable Spanner integration tests (#15948)

[noreply] Merge pull request #15926 from [BEAM-13110][Playground] Playground

[noreply] [BEAM-13025] Disable deduplicating messages, as Dedupe is broken on

[noreply] [BEAM-13001] collect DoFn metrics for Combine (#15911)


------------------------------------------
[...truncated 48.12 KB...]

See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
a869a04e1f77: Preparing
040ef4dc64d6: Preparing
dad6e3094b49: Preparing
89b0fc2d5155: Preparing
417a22b9dae1: Preparing
6958372bfa2d: Preparing
0ab501d5dd57: Preparing
cd236ee17a00: Preparing
ffbc5b28ded1: Preparing
1bf2d6c57d48: Preparing
6b769f6763e8: Preparing
7a69b1232949: Preparing
78700b6b35d0: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
cd236ee17a00: Waiting
ba6e5ff31f23: Preparing
6958372bfa2d: Waiting
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
62a747bf1719: Preparing
1bf2d6c57d48: Waiting
62a5b8741e83: Waiting
6b769f6763e8: Waiting
0b3c02b5d746: Waiting
9f9f651e9303: Waiting
36e0782f1159: Waiting
ba6e5ff31f23: Waiting
78700b6b35d0: Waiting
dad6e3094b49: Pushed
040ef4dc64d6: Pushed
417a22b9dae1: Pushed
a869a04e1f77: Pushed
6958372bfa2d: Pushed
89b0fc2d5155: Pushed
cd236ee17a00: Pushed
ffbc5b28ded1: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
6b769f6763e8: Pushed
9f9f651e9303: Layer already exists
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
0ab501d5dd57: Pushed
7a69b1232949: Pushed
1bf2d6c57d48: Pushed
20211112124625: digest: sha256:30196a759ff24d3bfa0a6da33f36eca413fade78dc7715a230afe2a24603537d size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 12, 2021 12:48:25 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 12, 2021 12:48:25 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 12, 2021 12:48:26 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 12, 2021 12:48:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 12, 2021 12:48:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 12, 2021 12:48:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 12, 2021 12:48:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 12, 2021 12:48:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111704 bytes, hash a94b1fadaa038bcbec4f0ef2dfe124c33780d89025b6bbeb3fd8034af93dd665> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-qUsfraoDi8vsTw7y3-EkwzeA2JAltrvrP9gDSvk91mU.pb
Nov 12, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 12, 2021 12:48:30 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3af7d855, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77049094, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f88bfbe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59bbe88a]
Nov 12, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 12, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 12, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 12, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 12, 2021 12:48:30 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3206174f]
Nov 12, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 12, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 12, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 12, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 12, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 12, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 12, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 12, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 12, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 12, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 12, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 12, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Nov 12, 2021 12:48:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-12_04_48_30-10808268834329089366?project=apache-beam-testing
Nov 12, 2021 12:48:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-12_04_48_30-10808268834329089366
Nov 12, 2021 12:48:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-12_04_48_30-10808268834329089366
Nov 12, 2021 12:48:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-12T12:48:36.449Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-v0gq. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 12, 2021 12:48:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:40.580Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 12, 2021 12:48:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:41.243Z: Expanding SplittableParDo operations into optimizable parts.
Nov 12, 2021 12:48:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:41.294Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 12, 2021 12:48:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:41.370Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 12, 2021 12:48:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:41.446Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 12, 2021 12:48:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:41.485Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 12, 2021 12:48:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:41.547Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:41.668Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:41.699Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:41.733Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:41.765Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:41.799Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:41.824Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:41.846Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:41.898Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:41.927Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:41.965Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:41.988Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:42.032Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:42.076Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:42.103Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:42.147Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:42.189Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:42.219Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:42.248Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:42.291Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:42.330Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:42.364Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:42.393Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:42.426Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 12, 2021 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:42.839Z: Starting 5 workers in us-central1-a...
Nov 12, 2021 12:48:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:48:48.710Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 12, 2021 12:49:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:49:28.086Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 12, 2021 12:50:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:50:21.864Z: Workers have started successfully.
Nov 12, 2021 12:50:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T12:50:21.906Z: Workers have started successfully.
Nov 12, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T16:00:28.691Z: Cancel request is committed for workflow job: 2021-11-12_04_48_30-10808268834329089366.
Nov 12, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T16:00:28.783Z: Cleaning up.
Nov 12, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T16:00:28.871Z: Stopping worker pool...
Nov 12, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T16:00:28.942Z: Stopping worker pool...
Nov 12, 2021 4:02:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T16:02:52.005Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 12, 2021 4:02:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-12T16:02:52.044Z: Worker pool stopped.
Nov 12, 2021 4:02:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-12_04_48_30-10808268834329089366 finished with status CANCELLED.
Load test results for test (ID): b51f9d9a-f860-44ad-aed2-1ccc8ada050b and timestamp: 2021-11-12T12:48:25.667000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11372.685
dataflow_v2_java11_total_bytes_count             2.04816701E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
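[Editor's note on the failure above: these streaming load tests run for a fixed window and are then cancelled externally, but the harness treats any terminal state other than DONE as a failure, which is why a deliberately cancelled job still fails the Gradle task. The following is a hypothetical, simplified sketch of that check — not the actual org.apache.beam.sdk.loadtests.JobFailure source — using an assumed JobStateCheck class and State enum for illustration only:]

```java
// Hypothetical sketch of the terminal-state check that produces the
// "Invalid job state: CANCELLED." error above. Not the real Beam code;
// it only illustrates the logic: any terminal state other than DONE
// raises, so an externally cancelled load test still fails the build.
public class JobStateCheck {

  // Assumed subset of Dataflow terminal job states relevant here.
  public enum State { DONE, FAILED, CANCELLED }

  // Throws when the job ended in any state other than DONE.
  public static void handleFailure(State terminalState) {
    if (terminalState != State.DONE) {
      throw new RuntimeException("Invalid job state: " + terminalState + ".");
    }
  }

  public static void main(String[] args) {
    handleFailure(State.DONE);          // passes silently
    try {
      handleFailure(State.CANCELLED);   // throws, mirroring the log above
    } catch (RuntimeException e) {
      System.out.println(e.getMessage());
    }
  }
}
```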

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211112124625
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30196a759ff24d3bfa0a6da33f36eca413fade78dc7715a230afe2a24603537d
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211112124625]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30196a759ff24d3bfa0a6da33f36eca413fade78dc7715a230afe2a24603537d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211112124625] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30196a759ff24d3bfa0a6da33f36eca413fade78dc7715a230afe2a24603537d])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30196a759ff24d3bfa0a6da33f36eca413fade78dc7715a230afe2a24603537d
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30196a759ff24d3bfa0a6da33f36eca413fade78dc7715a230afe2a24603537d
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30196a759ff24d3bfa0a6da33f36eca413fade78dc7715a230afe2a24603537d].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 16m 51s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dkdbg5vkihawa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #147

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/147/display/redirect?page=changes>

Changes:

[sergey.kalinin] Initial add for dockerfiles and build scripts

[sergey.kalinin] Initial add for terraform scripts for infra provision

[sergey.kalinin] Add licenses

[sergey.kalinin] Add fixes to docker plugin and also build arg for beam sdk image

[sergey.kalinin] Remove wrongly added comments

[sergey.kalinin] Add beam playground applications deploy

[sergey.kalinin] Remove wrongly added log file

[sergey.kalinin] Fix syntax

[sergey.kalinin] Added default values

[sergey.kalinin] Add default value to base image

[sergey.kalinin] Added Beam Version as build arg, also changed way of getting tags for

[sergey.kalinin] Added readme file

[sergey.kalinin] Add workflow to build backend application

[sergey.kalinin] Add license

[asottile] remove unused future / futures dependencies

[sergey.kalinin] Add frontend workflow

[sergey.kalinin] Add license to README and added reader access to bucket

[sergey.kalinin] Remove trailing whitespace

[relax] for existing tables, no need to set a schema

[avilovpavel6] Fix environment_service_test.go

[sergey.kalinin] Remove comment from dockerfile

[noreply] [BEAM-13190] Elide trailing nils (#15920)

[noreply] [BEAM-13052] Increment protobuf version to 3.19.0 (#15883)

[noreply] [BEAM-3304] refactoring trigger package in GoSDK (#15933)

[noreply] [BEAM-13001] Hook to configure sampling period for DoFn metric

[noreply] [BEAM-3293] Add MultiMap side input decomposition (#15934)

[noreply] [BEAM-12651] Exclude packages from jacoco report (#15792)

[noreply] [BEAM-13222] Skip spanner integration tests (#15942)

[noreply] BEAM-13189 Python TextIO: add escapechar feature. (#15901)


------------------------------------------
[...truncated 48.78 KB...]
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
1e048c164368: Preparing
997e117e5e87: Preparing
e2e032eff0a3: Preparing
17d52d9852e0: Preparing
9d26d2bba9b6: Preparing
62eb36d48d33: Preparing
e4ba9ced659c: Preparing
98df2d3515ab: Preparing
c63ca5a1ddaa: Preparing
1920673022c2: Preparing
406725ccd9a0: Preparing
d8281c757a99: Preparing
78700b6b35d0: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
c63ca5a1ddaa: Waiting
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
62a747bf1719: Preparing
406725ccd9a0: Waiting
ba6e5ff31f23: Waiting
d8281c757a99: Waiting
62a5b8741e83: Waiting
e4ba9ced659c: Waiting
9f9f651e9303: Waiting
62eb36d48d33: Waiting
98df2d3515ab: Waiting
0b3c02b5d746: Waiting
62a747bf1719: Waiting
78700b6b35d0: Waiting
36e0782f1159: Waiting
e2e032eff0a3: Pushed
9d26d2bba9b6: Pushed
997e117e5e87: Pushed
17d52d9852e0: Pushed
1e048c164368: Pushed
62eb36d48d33: Pushed
98df2d3515ab: Pushed
c63ca5a1ddaa: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
406725ccd9a0: Pushed
ba6e5ff31f23: Layer already exists
9f9f651e9303: Layer already exists
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
e4ba9ced659c: Pushed
d8281c757a99: Pushed
1920673022c2: Pushed
20211111124331: digest: sha256:7ef99c842bfc7cfa17ba0b0d1c3dbc603b4b7775aa909cd9b3eca5c9217e8a5d size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 11, 2021 12:45:25 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 11, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 11, 2021 12:45:26 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 11, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 11, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 11, 2021 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 11, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 11, 2021 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111704 bytes, hash 02e10b42c35c5dc81f90680922e37f28d6dd213d4a4a76d9258efa49c1132095> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-AuELQsNcXcgfkGgJIuN_KNbdIT1KSnbZJY76ScETIJU.pb
Nov 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 11, 2021 12:45:31 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3af7d855, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77049094, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f88bfbe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59bbe88a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d8ab698, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ed91d8d]
Nov 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 11, 2021 12:45:31 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3206174f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@42fb8c87, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15eb0ae9]
Nov 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 11, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Nov 11, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-11_04_45_32-6754862074261732272?project=apache-beam-testing
Nov 11, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-11_04_45_32-6754862074261732272
Nov 11, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-11_04_45_32-6754862074261732272
Nov 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-11T12:45:39.463Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-fgce. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 11, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:45.658Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 11, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:46.285Z: Expanding SplittableParDo operations into optimizable parts.
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:46.319Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:46.403Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:46.509Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:46.546Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:46.683Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.052Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.094Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.161Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.194Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.239Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.268Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.304Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.337Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.373Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.406Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.457Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.510Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.549Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.578Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.619Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.644Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.672Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.710Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.750Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.788Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.846Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.895Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:47.956Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 11, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:48.403Z: Starting 5 workers in us-central1-a...
Nov 11, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:45:50.569Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 11, 2021 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:46:33.244Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 11, 2021 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:47:26.542Z: Workers have started successfully.
Nov 11, 2021 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T12:47:26.642Z: Workers have started successfully.
Nov 11, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T16:00:26.692Z: Cancel request is committed for workflow job: 2021-11-11_04_45_32-6754862074261732272.
Nov 11, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T16:00:27.022Z: Cleaning up.
Nov 11, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T16:00:27.148Z: Stopping worker pool...
Nov 11, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T16:00:27.247Z: Stopping worker pool...
Nov 11, 2021 4:02:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T16:02:49.657Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 11, 2021 4:02:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-11T16:02:49.765Z: Worker pool stopped.
Nov 11, 2021 4:02:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-11_04_45_32-6754862074261732272 finished with status CANCELLED.
Load test results for test (ID): fec54057-83e8-4360-b104-5d53221f6452 and timestamp: 2021-11-11T12:45:26.196000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11545.394
dataflow_v2_java11_total_bytes_count             3.08687739E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211111124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7ef99c842bfc7cfa17ba0b0d1c3dbc603b4b7775aa909cd9b3eca5c9217e8a5d
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211111124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7ef99c842bfc7cfa17ba0b0d1c3dbc603b4b7775aa909cd9b3eca5c9217e8a5d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211111124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7ef99c842bfc7cfa17ba0b0d1c3dbc603b4b7775aa909cd9b3eca5c9217e8a5d])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7ef99c842bfc7cfa17ba0b0d1c3dbc603b4b7775aa909cd9b3eca5c9217e8a5d
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7ef99c842bfc7cfa17ba0b0d1c3dbc603b4b7775aa909cd9b3eca5c9217e8a5d
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7ef99c842bfc7cfa17ba0b0d1c3dbc603b4b7775aa909cd9b3eca5c9217e8a5d].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 39s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4ep3tcom6r23g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #146

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/146/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-12976] Add Java interfaces for projection pushdown.

[Kyle Weaver] fix cast

[Kyle Weaver] [BEAM-12976] Address review comments.

[Kyle Weaver] Hard-code output tag instead of making it public.

[Kyle Weaver] revert MAIN_OUTPUT_TAG to private

[Kyle Weaver] Remove WITHOUT_FIELD_ORDERING. Field reordering will be a prerequisite

[msbukal] Minor HL7v2IO Fixes

[dpcollins] Add detection of BundleFinalizers to DoFnSignatures

[dpcollins] Add enable_prime to the unified worker experiment list.

[dpcollins] Raise an exception before trying to launch a pipeline using

[dpcollins] Fix detection to properly null check.

[Andrew Pilloud] [BEAM-13056] Expose FieldAccess in DoFnSchemaInformation

[noreply] [BEAM-11097] Create hook to enable cross-bundle side input caching

[noreply] Merge pull request #15804 from [BEAM-13109][Playground] Add processing

[noreply] Improve FhirIO LRO Counters + minor fixes (#15921)

[noreply] Merge pull request #15771 from Enable BQ Standard SQL dialect style

[noreply] [BEAM-11217] Metrics querying for Pcol metrics (#15923)


------------------------------------------
[...truncated 46.67 KB...]
494d77973dc2: Preparing
b0ad93889d25: Preparing
05beb89e30dd: Preparing
862dcfb29807: Preparing
0be7a0def9f1: Preparing
11b44f891c78: Preparing
c7df4a35b5d2: Preparing
14723a6d7dbf: Preparing
78700b6b35d0: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
62a747bf1719: Preparing
c7df4a35b5d2: Waiting
14723a6d7dbf: Waiting
78700b6b35d0: Waiting
62a5b8741e83: Waiting
b0ad93889d25: Waiting
36e0782f1159: Waiting
05beb89e30dd: Waiting
ba6e5ff31f23: Waiting
862dcfb29807: Waiting
0be7a0def9f1: Waiting
9f9f651e9303: Waiting
11b44f891c78: Waiting
62a747bf1719: Waiting
0b3c02b5d746: Waiting
cb1589008ab5: Pushed
5fa432681ce0: Pushed
494d77973dc2: Pushed
3c78bb10b769: Pushed
b0ad93889d25: Pushed
862dcfb29807: Pushed
a77d4a74a163: Pushed
0be7a0def9f1: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
05beb89e30dd: Pushed
ba6e5ff31f23: Layer already exists
c7df4a35b5d2: Pushed
9f9f651e9303: Layer already exists
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
14723a6d7dbf: Pushed
11b44f891c78: Pushed
20211110124335: digest: sha256:c2d20cf75dc21d34c486f24a4c653116372b40d0bd74ddc7dfc361f0dccbd568 size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 10, 2021 12:45:27 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 10, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 10, 2021 12:45:28 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 10, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 10, 2021 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 10, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 10, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 10, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <111704 bytes, hash 095dc21e99473dc0c56fb5da5312918aa98f9b34be7a287e1d2ee79091c48eab> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-CV3CHplHPcDFb7XaUxKRiqmPmzS-eih-HS7nkJHEjqs.pb
Nov 10, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 10, 2021 12:45:32 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b676112, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5578be42, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e49ce2b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@136965e3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53c6f96d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@435cc7f9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4364712f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b7a52dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f93dd4e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ad5be4a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ad85136, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@737d100a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e5da86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6535117e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d1cbd0f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa13e6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3af7d855, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77049094, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f88bfbe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59bbe88a]
Nov 10, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 10, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 10, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 10, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 10, 2021 12:45:32 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ba6ec50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642413d4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fb2e3fd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a09ce2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f183caa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b66322e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63538bb4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24534cb0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a50d9fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@106d77da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f9c5048, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5114b7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dd71b20, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@767f6ee7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b6c6e70, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f324455, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a894088, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@370c1968, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f0bfe17, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3206174f]
Nov 10, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 10, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 10, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 10, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 10, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 10, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 10, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 10, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 10, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 10, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 10, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Nov 10, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-10_04_45_33-10542089246108965070?project=apache-beam-testing
Nov 10, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-10_04_45_33-10542089246108965070
Nov 10, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-10_04_45_33-10542089246108965070
Nov 10, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-10T12:45:49.335Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-e3id. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:54.136Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:54.854Z: Expanding SplittableParDo operations into optimizable parts.
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:54.891Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:54.971Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.037Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.074Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.150Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.260Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.299Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.336Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.360Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.402Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.449Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.474Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.509Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.549Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.594Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.634Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.680Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.706Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.742Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.779Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.816Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.850Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.874Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.913Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.944Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:55.981Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:56.001Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:56.032Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:45:56.401Z: Starting 5 workers in us-central1-a...
Nov 10, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:46:17.735Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 10, 2021 12:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:46:40.991Z: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 10, 2021 12:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:46:41.022Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
Nov 10, 2021 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:46:51.326Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 10, 2021 12:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:47:43.725Z: Workers have started successfully.
Nov 10, 2021 12:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T12:47:43.761Z: Workers have started successfully.
Nov 10, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T16:00:31.323Z: Cancel request is committed for workflow job: 2021-11-10_04_45_33-10542089246108965070.
Nov 10, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T16:00:31.394Z: Cleaning up.
Nov 10, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T16:00:31.510Z: Stopping worker pool...
Nov 10, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T16:00:31.575Z: Stopping worker pool...
Nov 10, 2021 4:02:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T16:02:44.046Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 10, 2021 4:02:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-10T16:02:44.101Z: Worker pool stopped.
Nov 10, 2021 4:02:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-10_04_45_33-10542089246108965070 finished with status CANCELLED.
Load test results for test (ID): dba4d411-65aa-4337-98d4-e96e40b93013 and timestamp: 2021-11-10T12:45:27.705000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                  11484.77
dataflow_v2_java11_total_bytes_count              1.7584729E10
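As a rough sanity check on the two metrics reported above, the sustained throughput can be estimated by dividing total bytes by runtime. This is a minimal sketch using the values from this run only; the class name and the MB/s framing are illustrative, not part of the load-test harness.

```java
public class ThroughputEstimate {
    public static void main(String[] args) {
        // Values copied from the load-test metrics reported above.
        double runtimeSec = 11484.77;        // dataflow_v2_java11_runtime_sec
        double totalBytes = 1.7584729e10;    // dataflow_v2_java11_total_bytes_count

        // Bytes per second, scaled to megabytes per second.
        double mbPerSec = totalBytes / runtimeSec / 1e6;
        System.out.printf("~%.2f MB/s sustained over the run%n", mbPerSec);
    }
}
```

This works out to roughly 1.53 MB/s over the 3h+ run before the job was cancelled.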
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
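The stack trace above shows the load test treating the job's terminal state as a failure. The following is a hypothetical sketch (not the actual Beam `JobFailure` source; names and the enum are illustrative) of that check: any terminal state other than DONE is converted into a RuntimeException, which is why a scheduler-cancelled job fails the Gradle task with exit value 1.

```java
public class JobFailureSketch {
    enum State { DONE, FAILED, CANCELLED }

    // Throws when the pipeline ended in any terminal state other than DONE.
    static void handleFailure(State terminalState) {
        if (terminalState != State.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        handleFailure(State.DONE); // completes quietly
        try {
            handleFailure(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // Invalid job state: CANCELLED.
        }
    }
}
```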

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211110124335
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c2d20cf75dc21d34c486f24a4c653116372b40d0bd74ddc7dfc361f0dccbd568
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211110124335]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c2d20cf75dc21d34c486f24a4c653116372b40d0bd74ddc7dfc361f0dccbd568]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211110124335] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c2d20cf75dc21d34c486f24a4c653116372b40d0bd74ddc7dfc361f0dccbd568])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c2d20cf75dc21d34c486f24a4c653116372b40d0bd74ddc7dfc361f0dccbd568
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c2d20cf75dc21d34c486f24a4c653116372b40d0bd74ddc7dfc361f0dccbd568
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c2d20cf75dc21d34c486f24a4c653116372b40d0bd74ddc7dfc361f0dccbd568].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 33s
101 actionable tasks: 72 executed, 27 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/oahi6vq4jfj32

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #145

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/145/display/redirect?page=changes>

Changes:

[aydar.zaynutdinov] [BEAM-13155][Playground]

[25622840+adude3141] [BEAM-13157] add regression test for hadoop configuration on

[noreply] Merge pull request #15873 from [BEAM-13181] Remove Sharding from

[noreply] Merge pull request #15813 from [BEAM-13071] [Playground] Update run code

[Kyle Weaver] [BEAM-5172] Temporary ignore testSplit and testSizes tests waiting for a


------------------------------------------
[...truncated 49.42 KB...]
ba6e5ff31f23: Layer already exists
9f9f651e9303: Layer already exists
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
0ef7ed11257c: Pushed
06eaf5132039: Pushed
20211109124335: digest: sha256:b1a4ac3e92f33de968d7af6b534ba5552d6a40e93ed30cf44a04c9897b1a8edf size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 09, 2021 12:45:27 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 09, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 09, 2021 12:45:28 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 09, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 09, 2021 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 09, 2021 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 09, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 09, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <109683 bytes, hash f10bdd28cc35a8d6e1baad6d541d81ffc608693d335b68493f7826586a84afc4> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-8QvdKMw1qNbhuq1tVB2B_8YIaT0zW2hJP3gmWGqEr8Q.pb
Nov 09, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 09, 2021 12:45:32 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ae62c7e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e869098, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37c36608, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d497a91, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@617389a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1c8f6a90, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3050ac2f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@265bd546, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1937eaff, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e0bc8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b0f2299, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33063f5b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15405bd6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@352ed70d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70730db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5793b87, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12704e15, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@512575e9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f1a16fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2373ad99]
Nov 09, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 09, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 09, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 09, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 09, 2021 12:45:32 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@285c6918, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78a0ff63, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c601d50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79b2852b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@326d27ac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d499d65, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@313f8301, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5cc9d3d0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c2dfa2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@661d88a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b0b64cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59ce792e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4860827a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@404db674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@50f097b5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7add838c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3662bdff, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1bb15351, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fa822ad, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@597f0937]
Nov 09, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 09, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 09, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 09, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 09, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 09, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 09, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 09, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 09, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 09, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 09, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 09, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Nov 09, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-09_04_45_32-15915895828950689049?project=apache-beam-testing
Nov 09, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-09_04_45_32-15915895828950689049
Nov 09, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-09_04_45_32-15915895828950689049
Nov 09, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-09T12:45:40.746Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-wf64. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:47.233Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:47.884Z: Expanding SplittableParDo operations into optimizable parts.
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:47.917Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.019Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.112Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.156Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.239Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.337Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.394Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.435Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.465Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.499Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.531Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.567Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.619Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.669Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.717Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.752Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.795Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.837Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.887Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.925Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.959Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:48.998Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:49.039Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:49.085Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:49.121Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:49.150Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:49.182Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 09, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:49.221Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 09, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:49.674Z: Starting 5 workers in us-central1-a...
Nov 09, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:45:53.163Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 09, 2021 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:46:34.900Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 09, 2021 12:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:47:29.985Z: Workers have started successfully.
Nov 09, 2021 12:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T12:47:30.019Z: Workers have started successfully.
Nov 09, 2021 1:36:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-09T13:36:53.262Z: Staged package kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar' is inaccessible.
Nov 09, 2021 1:36:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-09T13:36:53.465Z: Staged package lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar' is inaccessible.
Nov 09, 2021 1:36:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-09T13:36:54.453Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 09, 2021 1:39:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-09T13:39:54.439Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 09, 2021 1:42:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-09T13:42:53.039Z: Staged package kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar' is inaccessible.
Nov 09, 2021 1:42:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-09T13:42:53.135Z: Staged package lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar' is inaccessible.
Nov 09, 2021 1:42:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-09T13:42:54.062Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 09, 2021 1:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-09T13:45:54.270Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 09, 2021 1:48:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-09T13:48:53.183Z: Staged package kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar' is inaccessible.
Nov 09, 2021 1:48:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-09T13:48:53.313Z: Staged package lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar' is inaccessible.
Nov 09, 2021 1:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-09T13:48:54.273Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 09, 2021 1:51:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-09T13:51:54.500Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 09, 2021 1:54:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-09T13:54:53.252Z: Staged package kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar' is inaccessible.
Nov 09, 2021 1:54:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-09T13:54:53.349Z: Staged package lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar' is inaccessible.
Nov 09, 2021 1:54:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-09T13:54:54.299Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 09, 2021 1:57:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-09T13:57:54.482Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 09, 2021 2:00:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-09T14:00:53.714Z: Staged package kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar' is inaccessible.
Nov 09, 2021 2:00:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-09T14:00:53.832Z: Staged package lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar' is inaccessible.
Nov 09, 2021 2:00:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-09T14:00:54.936Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 09, 2021 2:03:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-09T14:03:54.555Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 09, 2021 3:38:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T15:38:03.225Z: Workers have started successfully.
Nov 09, 2021 3:38:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T15:38:09.841Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 09, 2021 3:38:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T15:38:27.569Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 09, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T16:00:34.550Z: Cancel request is committed for workflow job: 2021-11-09_04_45_32-15915895828950689049.
Nov 09, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T16:00:34.586Z: Cleaning up.
Nov 09, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T16:00:34.682Z: Stopping worker pool...
Nov 09, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T16:00:34.745Z: Stopping worker pool...
Nov 09, 2021 4:02:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T16:02:53.255Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 09, 2021 4:02:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-09T16:02:53.296Z: Worker pool stopped.
Nov 09, 2021 4:02:59 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-09_04_45_32-15915895828950689049 finished with status CANCELLED.
Load test results for test (ID): 7ed0c298-cb29-4d36-87fc-df67b0b3d9a3 and timestamp: 2021-11-09T12:45:27.964000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11553.589
dataflow_v2_java11_total_bytes_count             1.92578252E10
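Taken together, the two metrics above imply an average throughput of roughly 1.67 MB/s over the run before cancellation. A quick back-of-the-envelope check (a sketch only; the values are copied verbatim from the results table above):

```python
# Derive average pipeline throughput from the two metrics reported above.
runtime_sec = 11553.589       # dataflow_v2_java11_runtime_sec
total_bytes = 1.92578252e10   # dataflow_v2_java11_total_bytes_count

throughput_bytes_per_sec = total_bytes / runtime_sec
print(f"{throughput_bytes_per_sec / 1e6:.2f} MB/s")
```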
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
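The stack trace shows LoadTest.run delegating to JobFailure.handleFailure, which rejects any terminal state other than the expected one. A minimal Python sketch of that check (the class and method names are taken from the trace above; this logic is an illustration under that assumption, not Beam's actual implementation):

```python
# Illustrative re-creation of the terminal-state check that produced the
# "Invalid job state: CANCELLED." error above. Not Beam's real code.
EXPECTED_TERMINAL_STATES = {"DONE"}  # assumed expected state for a load test

def handle_failure(terminal_state: str) -> None:
    """Raise if the job ended in anything other than an expected state."""
    if terminal_state not in EXPECTED_TERMINAL_STATES:
        raise RuntimeError(f"Invalid job state: {terminal_state}.")

handle_failure("DONE")  # an expected state passes silently
```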

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211109124335
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b1a4ac3e92f33de968d7af6b534ba5552d6a40e93ed30cf44a04c9897b1a8edf
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211109124335]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b1a4ac3e92f33de968d7af6b534ba5552d6a40e93ed30cf44a04c9897b1a8edf]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211109124335] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b1a4ac3e92f33de968d7af6b534ba5552d6a40e93ed30cf44a04c9897b1a8edf])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b1a4ac3e92f33de968d7af6b534ba5552d6a40e93ed30cf44a04c9897b1a8edf
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b1a4ac3e92f33de968d7af6b534ba5552d6a40e93ed30cf44a04c9897b1a8edf
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b1a4ac3e92f33de968d7af6b534ba5552d6a40e93ed30cf44a04c9897b1a8edf].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 41s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qrlww2fljv6nk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #144

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/144/display/redirect>

Changes:


------------------------------------------
[...truncated 48.33 KB...]
c04a1b798a26: Preparing
b71b7c5f32ad: Preparing
183f35830e0e: Preparing
fb344681763e: Preparing
e988ae52c7d1: Preparing
a26246fcdaab: Preparing
5f1e6364ea7f: Preparing
352b42c5ec40: Preparing
09d6d0a70316: Preparing
1874387716a0: Preparing
7a66e17b3e82: Preparing
89ce2f4c7d1a: Preparing
78700b6b35d0: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
5f1e6364ea7f: Waiting
62a747bf1719: Preparing
352b42c5ec40: Waiting
7a66e17b3e82: Waiting
62a5b8741e83: Waiting
09d6d0a70316: Waiting
36e0782f1159: Waiting
1874387716a0: Waiting
89ce2f4c7d1a: Waiting
ba6e5ff31f23: Waiting
78700b6b35d0: Waiting
0b3c02b5d746: Waiting
9f9f651e9303: Waiting
a26246fcdaab: Waiting
62a747bf1719: Waiting
183f35830e0e: Pushed
e988ae52c7d1: Pushed
b71b7c5f32ad: Pushed
fb344681763e: Pushed
c04a1b798a26: Pushed
a26246fcdaab: Pushed
352b42c5ec40: Pushed
09d6d0a70316: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
9f9f651e9303: Layer already exists
7a66e17b3e82: Pushed
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
5f1e6364ea7f: Pushed
89ce2f4c7d1a: Pushed
1874387716a0: Pushed
20211108124334: digest: sha256:f5d957cb4f8eab9dfad32286fe173af315dd66545b7e0e7591f374b7a6d4ca3d size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 08, 2021 12:45:27 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 08, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 08, 2021 12:45:28 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 08, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 08, 2021 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 08, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 08, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 08, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <109683 bytes, hash f045e696f2a6625bc71736ec88f7c9aea5b3d9aef59cbb7ce272d76931fada9f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-8EXmlvKmYlvHFzbsiPfJrqWz2a71nLt84nLXaTH62p8.pb
Nov 08, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 08, 2021 12:45:33 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ae62c7e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e869098, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37c36608, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d497a91, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@617389a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1c8f6a90, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3050ac2f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@265bd546, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1937eaff, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e0bc8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b0f2299, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33063f5b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15405bd6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@352ed70d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70730db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5793b87, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12704e15, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@512575e9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f1a16fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2373ad99]
Nov 08, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 08, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 08, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 08, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 08, 2021 12:45:33 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@285c6918, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78a0ff63, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c601d50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79b2852b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@326d27ac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d499d65, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@313f8301, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5cc9d3d0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c2dfa2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@661d88a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b0b64cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59ce792e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4860827a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@404db674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@50f097b5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7add838c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3662bdff, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1bb15351, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fa822ad, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@597f0937]
Nov 08, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 08, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 08, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 08, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 08, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 08, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 08, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 08, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 08, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 08, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 08, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 08, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Nov 08, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-08_04_45_33-7389419280968439881?project=apache-beam-testing
Nov 08, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-08_04_45_33-7389419280968439881
Nov 08, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-08_04_45_33-7389419280968439881
Nov 08, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-08T12:45:40.883Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-x9nq. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 08, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:45.194Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:45.926Z: Expanding SplittableParDo operations into optimizable parts.
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:45.956Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.025Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.088Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.123Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.167Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.277Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.314Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.349Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.371Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.404Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.436Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.480Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.513Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.546Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.571Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.607Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.627Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.652Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.676Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.702Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.725Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.757Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.791Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.824Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.857Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.892Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.927Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:46.960Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 08, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:45:47.314Z: Starting 5 workers in us-central1-a...
Nov 08, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:46:18.977Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 08, 2021 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:46:32.171Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 08, 2021 12:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:47:28.920Z: Workers have started successfully.
Nov 08, 2021 12:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T12:47:28.967Z: Workers have started successfully.
Nov 08, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T16:00:32.838Z: Cancel request is committed for workflow job: 2021-11-08_04_45_33-7389419280968439881.
Nov 08, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T16:00:32.921Z: Cleaning up.
Nov 08, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T16:00:33.000Z: Stopping worker pool...
Nov 08, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T16:00:33.114Z: Stopping worker pool...
Nov 08, 2021 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T16:02:55.783Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 08, 2021 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-08T16:02:55.836Z: Worker pool stopped.
Nov 08, 2021 4:03:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-08_04_45_33-7389419280968439881 finished with status CANCELLED.
Load test results for test (ID): 4a6b3be3-5b65-48e9-b0c3-3a2ca45dbbdd and timestamp: 2021-11-08T12:45:27.995000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11553.556
dataflow_v2_java11_total_bytes_count             1.80117038E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211108124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f5d957cb4f8eab9dfad32286fe173af315dd66545b7e0e7591f374b7a6d4ca3d
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211108124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f5d957cb4f8eab9dfad32286fe173af315dd66545b7e0e7591f374b7a6d4ca3d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211108124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f5d957cb4f8eab9dfad32286fe173af315dd66545b7e0e7591f374b7a6d4ca3d])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f5d957cb4f8eab9dfad32286fe173af315dd66545b7e0e7591f374b7a6d4ca3d
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f5d957cb4f8eab9dfad32286fe173af315dd66545b7e0e7591f374b7a6d4ca3d
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f5d957cb4f8eab9dfad32286fe173af315dd66545b7e0e7591f374b7a6d4ca3d].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 46s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5ndm6qzatdbxm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 143 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 - Build # 143 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/143/ to view the results.

Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #142

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/142/display/redirect?page=changes>

Changes:

[noreply] [adhoc] Speedup slow tests for AWS IO modules (#15899)

[noreply] [BEAM-12566] Implement set_axis for DataFrame and Series (#15773)

[noreply] [BEAM-13192] Fix buggy retry tests for AWS SnsIO (#15910)

[noreply] [BEAM-11217] Metrics Query filtering for DoFn metrics. (#15887)

[noreply] [Go SDK] Go SDK Exits Experimental (#15894)

[joseinigo] [BEAM-13080] Fix number of default keys

[joseinigo] [BEAM-13080] Fix number of default keys


------------------------------------------
[...truncated 48.60 KB...]
b54e21fc98b6: Preparing
80513f585ca8: Preparing
b306fc598105: Preparing
e1ea39e25414: Preparing
2c253a1ecb3d: Preparing
e823c04abb62: Preparing
f8506eb6d448: Preparing
66dbdac24a88: Preparing
78700b6b35d0: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
62a747bf1719: Preparing
e823c04abb62: Waiting
2c253a1ecb3d: Waiting
e1ea39e25414: Waiting
f8506eb6d448: Waiting
80513f585ca8: Waiting
66dbdac24a88: Waiting
b306fc598105: Waiting
9f9f651e9303: Waiting
36e0782f1159: Waiting
78700b6b35d0: Waiting
0b3c02b5d746: Waiting
62a5b8741e83: Waiting
ba6e5ff31f23: Waiting
62a747bf1719: Waiting
b54e21fc98b6: Pushed
effff550acfb: Pushed
999d8bb81cd1: Pushed
79f8b53e4f15: Pushed
c966e6346529: Pushed
80513f585ca8: Pushed
e1ea39e25414: Pushed
2c253a1ecb3d: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
b306fc598105: Pushed
9f9f651e9303: Layer already exists
62a747bf1719: Layer already exists
0b3c02b5d746: Layer already exists
66dbdac24a88: Pushed
f8506eb6d448: Pushed
e823c04abb62: Pushed
20211106124333: digest: sha256:3767c712d2b32b7250bfe8e82e722d8cb59e058c2022992b6e73288bb2a27063 size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 06, 2021 12:45:32 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 06, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 06, 2021 12:45:33 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 06, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 06, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 06, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 06, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 06, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <109683 bytes, hash e0f99d2369055f65369016f95c646e02ac6e57f4db3c0df472216533e8cdd873> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-4PmdI2kFX2U2kBb5XGRuAqxuV_TbPA30ciFlM-jN2HM.pb
Nov 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 06, 2021 12:45:38 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ae62c7e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e869098, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37c36608, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d497a91, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@617389a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1c8f6a90, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3050ac2f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@265bd546, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1937eaff, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e0bc8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b0f2299, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33063f5b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15405bd6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@352ed70d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70730db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5793b87, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12704e15, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@512575e9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f1a16fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2373ad99]
Nov 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 06, 2021 12:45:38 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@285c6918, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78a0ff63, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c601d50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79b2852b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@326d27ac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d499d65, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@313f8301, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5cc9d3d0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c2dfa2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@661d88a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b0b64cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59ce792e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4860827a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@404db674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@50f097b5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7add838c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3662bdff, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1bb15351, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fa822ad, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@597f0937]
Nov 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Nov 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-06_05_45_38-12108461136062105938?project=apache-beam-testing
Nov 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-06_05_45_38-12108461136062105938
Nov 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-06_05_45_38-12108461136062105938
Nov 06, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-06T12:45:45.142Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-omxk. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 06, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.029Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.555Z: Expanding SplittableParDo operations into optimizable parts.
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.586Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.653Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.728Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.753Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.821Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.914Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.951Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.983Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.006Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.041Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.067Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.098Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.120Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.157Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.192Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.225Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.251Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.285Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.314Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.348Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.378Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.399Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.435Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 06, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.494Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 06, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.521Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 06, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.550Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 06, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.574Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 06, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.608Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 06, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:52.072Z: Starting 5 workers in us-central1-a...
Nov 06, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:46:09.614Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 06, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:46:37.836Z: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 06, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:46:37.861Z: Resized worker pool to 3, though goal was 5. This could be a quota issue.
Nov 06, 2021 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:46:48.163Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 06, 2021 12:47:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:47:36.437Z: Workers have started successfully.
Nov 06, 2021 12:47:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:47:36.463Z: Workers have started successfully.
Nov 06, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T16:00:36.044Z: Cancel request is committed for workflow job: 2021-11-06_05_45_38-12108461136062105938.
Nov 06, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T16:00:36.108Z: Cleaning up.
Nov 06, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T16:00:36.176Z: Stopping worker pool...
Nov 06, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T16:00:36.229Z: Stopping worker pool...
Nov 06, 2021 4:03:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T16:03:26.712Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 06, 2021 4:03:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T16:03:26.749Z: Worker pool stopped.
Nov 06, 2021 4:03:34 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-06_05_45_38-12108461136062105938 finished with status CANCELLED.
Load test results for test (ID): fe22fbdc-a19f-469e-adaa-aac694f6e264 and timestamp: 2021-11-06T12:45:33.244000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11523.914
dataflow_v2_java11_total_bytes_count             2.67386971E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
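The "Invalid job state: CANCELLED" above is the load-test harness intentionally failing the run: the streaming job was cancelled (apparently around the job's time limit, after roughly 3h15m of runtime), and the harness treats any terminal state other than DONE as a failure. A minimal sketch of that pattern follows; the JobState enum and method shape are illustrative, not Beam's actual API, though the exception message matches the one in JobFailure.handleFailure above.

```java
// Sketch of a terminal-state check for a pipeline load test.
// Any terminal state other than DONE (e.g. CANCELLED, FAILED) is
// surfaced as a RuntimeException so the build goes red.
// Names here are illustrative, not the real Beam API.
public class JobFailureSketch {
    enum JobState { DONE, FAILED, CANCELLED, UPDATED }

    static void handleFailure(JobState terminalState) {
        if (terminalState != JobState.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        String message = null;
        try {
            handleFailure(JobState.CANCELLED); // the state seen in this build
        } catch (RuntimeException e) {
            message = e.getMessage();
        }
        System.out.println(message);
    }
}
```

Under this convention a job cancelled by a timeout is indistinguishable from a genuine failure, which is why these load tests report FAILED even when the pipeline itself ran cleanly until cancellation.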

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211106124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3767c712d2b32b7250bfe8e82e722d8cb59e058c2022992b6e73288bb2a27063
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211106124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3767c712d2b32b7250bfe8e82e722d8cb59e058c2022992b6e73288bb2a27063]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211106124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3767c712d2b32b7250bfe8e82e722d8cb59e058c2022992b6e73288bb2a27063])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3767c712d2b32b7250bfe8e82e722d8cb59e058c2022992b6e73288bb2a27063
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3767c712d2b32b7250bfe8e82e722d8cb59e058c2022992b6e73288bb2a27063
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3767c712d2b32b7250bfe8e82e722d8cb59e058c2022992b6e73288bb2a27063].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 20m 18s
101 actionable tasks: 71 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/pryongyp37moa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #141

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/141/display/redirect?page=changes>

Changes:

[heejong] [BEAM-13021] Deduplicate Python artifact not only by hash but also by

[heejong] add test

[heejong] rename variables

[Luke Cwik] [BEAM-3811] Code clean-up of the CancelleableQueue to not throw

[heejong] rearrange comparison order

[Luke Cwik] [BEAM-13164] Address most of a race condition between instantiation and

[Robert Bradshaw] Defer filesToStage construction until after full jar resolution.

[heejong] s/pathes/paths/

[noreply] Merge pull request #15835: [BEAM-11205] Google Libraries BOM 24.0.0 and

[noreply] Merge pull request #15810: [BEAM-2791] Support low-latency StorageApi

[noreply] [BEAM-13081] Fixes a compatible issue of decoding null-value bitmap


------------------------------------------
[...truncated 49.66 KB...]
ba6e5ff31f23: Waiting
0b3c02b5d746: Waiting
379f31d19858: Waiting
eaf5029db1fb: Waiting
36e0782f1159: Waiting
fabf3b6d80fc: Pushed
a7c2e9d18f2a: Pushed
be23f8775916: Pushed
fe9147d6e839: Pushed
2eae6dabe400: Pushed
ad122fac4193: Pushed
eaf5029db1fb: Pushed
379f31d19858: Pushed
78700b6b35d0: Layer already exists
e967acfd455f: Pushed
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
9f9f651e9303: Layer already exists
6ab784edc93e: Pushed
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
89278d4876d4: Pushed
e12820d25c5b: Pushed
20211105124339: digest: sha256:7c124b4b6a5f6433e10fb4a36068ef12a9045b1acb9378583a7864d4128fb619 size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 05, 2021 12:45:44 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 05, 2021 12:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 197 files. Enable logging at DEBUG level to see which files will be staged.
Nov 05, 2021 12:45:46 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 05, 2021 12:45:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 05, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 05, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 seconds
Nov 05, 2021 12:45:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 05, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <109683 bytes, hash 53e97eec66dbdbeaed8338582d934c1ee4c30162f04d98769a3bade2b65c5a3f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-U-l-7Gbb2-rtgzhYLZNMHuTDAWLwTZh2mjut4rZcWj8.pb
Nov 05, 2021 12:45:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 05, 2021 12:45:51 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37c36608, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d497a91, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@617389a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1c8f6a90, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3050ac2f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@265bd546, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1937eaff, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e0bc8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b0f2299, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33063f5b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15405bd6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@352ed70d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70730db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5793b87, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12704e15, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@512575e9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f1a16fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2373ad99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33634f04, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4993febc]
Nov 05, 2021 12:45:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 05, 2021 12:45:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 05, 2021 12:45:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 05, 2021 12:45:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 05, 2021 12:45:51 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c601d50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79b2852b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@326d27ac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d499d65, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@313f8301, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5cc9d3d0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c2dfa2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@661d88a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b0b64cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59ce792e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4860827a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@404db674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@50f097b5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7add838c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3662bdff, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1bb15351, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fa822ad, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@597f0937, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ad1caa2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6b6b3572]
Nov 05, 2021 12:45:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 05, 2021 12:45:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 05, 2021 12:45:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 05, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 05, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 05, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 05, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 05, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 05, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 05, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 05, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 05, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Nov 05, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-05_05_45_52-15779773588238686739?project=apache-beam-testing
Nov 05, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-05_05_45_52-15779773588238686739
Nov 05, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-05_05_45_52-15779773588238686739
Nov 05, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-05T12:46:00.636Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-f0m2. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 05, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:05.064Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:05.705Z: Expanding SplittableParDo operations into optimizable parts.
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:05.738Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:05.800Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:05.869Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:05.896Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:05.966Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.080Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.111Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.135Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.174Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.201Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.222Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.246Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.282Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.311Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.344Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.365Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.390Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.412Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.432Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.464Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.497Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.530Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.563Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.591Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.620Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.687Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 05, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.745Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 05, 2021 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:06.825Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 05, 2021 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:07.363Z: Starting 5 workers in us-central1-a...
Nov 05, 2021 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:26.364Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 05, 2021 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:46:58.100Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 05, 2021 12:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:47:52.174Z: Workers have started successfully.
Nov 05, 2021 12:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T12:47:52.239Z: Workers have started successfully.
Nov 05, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T16:00:40.085Z: Cancel request is committed for workflow job: 2021-11-05_05_45_52-15779773588238686739.
Nov 05, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T16:00:40.154Z: Cleaning up.
Nov 05, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T16:00:40.263Z: Stopping worker pool...
Nov 05, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T16:00:40.310Z: Stopping worker pool...
Nov 05, 2021 4:02:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T16:02:57.894Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 05, 2021 4:02:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-05T16:02:57.928Z: Worker pool stopped.
Nov 05, 2021 4:03:06 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-05_05_45_52-15779773588238686739 finished with status CANCELLED.
Load test results for test (ID): fdcfcbf3-53b2-4585-abe6-ae67c4eaec3b and timestamp: 2021-11-05T12:45:45.675000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11545.383
dataflow_v2_java11_total_bytes_count             2.06464136E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211105124339
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7c124b4b6a5f6433e10fb4a36068ef12a9045b1acb9378583a7864d4128fb619
Deleted: sha256:eb95e0312ce127d3b9e8f8485f70e5368fb345db908e57c7d46ea92d3d5c8091
Deleted: sha256:db1ccc2876864aea6b16892a501b24c7326915438d014040506157ad8b5c56de
Deleted: sha256:f410e6253d7aa3b307b1e3bd2dc67794e5817bf4f0618d6b07aaad87eefa56c0
Deleted: sha256:7aec492d546a60b380484e7dbd800744e71f3f8477317a7619b24eb1a84da9d5
Deleted: sha256:6dcc2cbe661ad50072415c7cec4a6e263ae43572b8b27554e8e9ab453556a77d
Deleted: sha256:65da22f6d1b7d948392bf88292c4f90c7147dfc6a29850ff9706d5fb6fda9fb6
Deleted: sha256:67d81025792b008c535cbf03f13b3477ab7a7b0c88496d9810ed1d89f2d4e040
Deleted: sha256:539fd19ece85bd38f0923f31e8c6fbb1809d3a17dcb8dabaddac242bb8577d5f
Deleted: sha256:591038e15cda4123dc7518edfc8367161c86d75a160410b2cf0f5b7ae3d23a49
Deleted: sha256:bd4d7b484734e4db207d5b844e79900da4867f2f8974e44a0bbe2e31b9a11ee2
Deleted: sha256:db57a098f64e865ef93269b97848973156e108fe69938f254404bdd9b705b5fc
Deleted: sha256:21738747e6f0793337ff0d241e85f3ebf58e8fc9ab43fc9bef93919f76e46582
Deleted: sha256:91c2d19acdd1e1b10a7d7ef38b85c16deac8d8e8b2a19324534eb376662a897a
Deleted: sha256:4d5b0cc523b2100f02312d099d6ab08ac2928c1fd6f996799d42212767e9ba07
Deleted: sha256:948f43fde6337fb52f36de81e8df7fe0acbf849e71d030233d2bf15b8c2ac2d1
Deleted: sha256:51edd63245e6a97eba5e8dc4705d08e7d4e7a7945a6d99c56269144aa31ed41f
Deleted: sha256:d3a83747995e35445c4e0be18dc442dfaa3f528cee9319f0a27ad4ad40aa7adc
Deleted: sha256:c97c3fecea151198a661ed096e616dc129ce034bfe1c94b1cc808f1539fa5773
Deleted: sha256:4cda81f2c6642e6eee1dc682ce57c945b2f8a4dfa499efcb37d6454b008fcc56
Deleted: sha256:5ea5b897c81f256fd999bfed9d6fba8c3821b58da93a92cee3af61aff1541582
Deleted: sha256:57855ec99b059f068754a683fb257e90eceee4d90d27a3ac71f814fde051f411
Deleted: sha256:e361b4f486f4d8dafc7f03401a9aaf862894cd8539f211363ef234465ed0ab83
Deleted: sha256:e70a67bfa95bf22ea7e0dc62e79e03645645e0ccbcc9e8adddf22cfb7d44a0f9
Deleted: sha256:d8454c7162a6893e31c95b19f828e27e60c7e73e0a240086c2ae022e05b6e797
Deleted: sha256:d09c1f1dabf3e70f651607b87b74ffd45634cee733f2878d92bcd25d4d88cca5
Deleted: sha256:fb3fd05eff157eb2b14ba630a1afbea81dae6600afba98dc18642b82101dbc74
Deleted: sha256:188c8bb34ba2be4e4a372caf7278cea5787996678973d369ba4c5b71c0b5e7c4
Deleted: sha256:ad1ab5724299fb8f9e7190385ca7d78d2fe0080969089571b4c49bfeac0fedb1
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211105124339]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7c124b4b6a5f6433e10fb4a36068ef12a9045b1acb9378583a7864d4128fb619]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211105124339] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7c124b4b6a5f6433e10fb4a36068ef12a9045b1acb9378583a7864d4128fb619])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7c124b4b6a5f6433e10fb4a36068ef12a9045b1acb9378583a7864d4128fb619
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7c124b4b6a5f6433e10fb4a36068ef12a9045b1acb9378583a7864d4128fb619
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7c124b4b6a5f6433e10fb4a36068ef12a9045b1acb9378583a7864d4128fb619].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 48s
101 actionable tasks: 72 executed, 27 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/2jwgvbmclkgme

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #140

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/140/display/redirect?page=changes>

Changes:

[noreply] Add missing parentheses for Python test example

[noreply] Update test input and imports

[stranniknm] [BEAM-13105] playground - add shortcut hint tooltip

[mmack] [BEAM-11440] Add integration test for KinesisIO using Localstack on

[mmack] Bump localstack container version for more robust version with respect

[mmack] Bump localstack container version for DynamoDBIOTest to version used in

[mmack] Apply format

[mmack] Apply format

[mmack] Use vendored guava deps

[25622840+adude3141] [BEAM-13157] support hadoop configuration on ParquetIO.Parse

[Pablo Estrada] JdbcIO has a single WriteFn underlying all implementations

[dpcollins] Performance improvement to PubSubLiteIO to not use a streaming committer

[noreply] Merge pull request #15854 from [BEAM-13046][Playground] protobuf

[noreply] [BEAM-12550] Parallelizable skew Implementation  (#15809)

[noreply] Merge pull request #15782 from [BEAM-13034] [Playground] add semantics

[hlagosperez] Use BigQuieryIO.loadProjectId  in WriteRename class to create

[noreply] [BEAM-13099] Use vendored Calcite 1.28.0 in SQL extensions (#15836)

[Luke Cwik] [BEAM-13015] Use a network based channel instead of an inmemory one

[Pablo Estrada] Addressing comments

[noreply] Merge pull request #15839 from [BEAM-13041][Playground] Prepare files

[noreply] [BEAM-13001] updated CHANGES.md to include msec counter for Go (#15872)

[noreply] Add window mapping to CHANGES.md (#15871)

[noreply] Merge pull request #15784 from [BEAM-8135]  - Removing

[noreply] [BEAM-13119] Subdirectory prefix tag for Go SDK (#15881)

[noreply] Merge pull request #15852 from [BEAM-13102] [Playground] update

[Kyle Weaver] Fix typo: s/spark/twister2


------------------------------------------
[...truncated 232.95 KB...]
Nov 04, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:49:13.732Z: Staged package metrics-core-3.1.2-JFuipmqbxxDOTbFHERJud7y05tlu9-YiZZKA88kMu1w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/metrics-core-3.1.2-JFuipmqbxxDOTbFHERJud7y05tlu9-YiZZKA88kMu1w.jar' is inaccessible.
Nov 04, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:49:14.075Z: Staged package proto-google-cloud-bigquerystorage-v1-2.1.0--M8Hmi82bEAs_rlZfGOmOJeniDwYV9GXd4uYuCIzxIg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-bigquerystorage-v1-2.1.0--M8Hmi82bEAs_rlZfGOmOJeniDwYV9GXd4uYuCIzxIg.jar' is inaccessible.
Nov 04, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:49:14.112Z: Staged package proto-google-cloud-bigquerystorage-v1beta1-0.125.0-qKRRBewW8WdLWXjsY9uGABgBUBdfwqmhSwqFkWRQV1Q.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-bigquerystorage-v1beta1-0.125.0-qKRRBewW8WdLWXjsY9uGABgBUBdfwqmhSwqFkWRQV1Q.jar' is inaccessible.
Nov 04, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:49:14.173Z: Staged package proto-google-cloud-bigquerystorage-v1beta2-0.125.0-JSL7XosXdUS5ZGraL_j5Dnbf6dY04ZrPTdxB_T_NsNc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-bigquerystorage-v1beta2-0.125.0-JSL7XosXdUS5ZGraL_j5Dnbf6dY04ZrPTdxB_T_NsNc.jar' is inaccessible.
Nov 04, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:49:14.209Z: Staged package proto-google-cloud-bigtable-admin-v2-2.1.0-Oha95NdNmvlfAh6KTLXm9FygpotYhLsS7Cb837ZbIJo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-bigtable-admin-v2-2.1.0-Oha95NdNmvlfAh6KTLXm9FygpotYhLsS7Cb837ZbIJo.jar' is inaccessible.
Nov 04, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:49:14.246Z: Staged package proto-google-cloud-bigtable-v2-2.1.0-BA4Xw2RDAfoRzQCX0ac5VopZzaDym4fSBY0Msbm8txM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-bigtable-v2-2.1.0-BA4Xw2RDAfoRzQCX0ac5VopZzaDym4fSBY0Msbm8txM.jar' is inaccessible.
Nov 04, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:49:14.285Z: Staged package proto-google-cloud-datastore-v1-0.91.3-vS4xd5n4PaDd-Kyt6_V3G8YGqO_QMcuShghC9MLpSns.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-datastore-v1-0.91.3-vS4xd5n4PaDd-Kyt6_V3G8YGqO_QMcuShghC9MLpSns.jar' is inaccessible.
Nov 04, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:49:14.332Z: Staged package proto-google-cloud-firestore-bundle-v1-3.0.2-vr6gpK2KOX9hYv8tzTr4-RsKC2t_wRcUKeLYKqLm2oQ.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-firestore-bundle-v1-3.0.2-vr6gpK2KOX9hYv8tzTr4-RsKC2t_wRcUKeLYKqLm2oQ.jar' is inaccessible.
Nov 04, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:49:14.386Z: Staged package proto-google-cloud-firestore-v1-3.0.2-e7pBKXoawrVbDlBanNo5GpdjMFfUfMropSFYGm-RA6I.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-firestore-v1-3.0.2-e7pBKXoawrVbDlBanNo5GpdjMFfUfMropSFYGm-RA6I.jar' is inaccessible.
Nov 04, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:49:14.438Z: Staged package proto-google-cloud-pubsub-v1-1.96.2-Hd5TFaDg9u0PWkcRBAM7lW0P0qLQO4y5u-JvPoVBYsQ.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-pubsub-v1-1.96.2-Hd5TFaDg9u0PWkcRBAM7lW0P0qLQO4y5u-JvPoVBYsQ.jar' is inaccessible.
Nov 04, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:49:14.668Z: Staged package proto-google-cloud-spanner-admin-database-v1-6.12.1-i6wlp_PYFwGObAmINb8JcCzH8AbDpzYSVJyQTC4R8EA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-spanner-admin-database-v1-6.12.1-i6wlp_PYFwGObAmINb8JcCzH8AbDpzYSVJyQTC4R8EA.jar' is inaccessible.
Nov 04, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:49:14.713Z: Staged package proto-google-cloud-spanner-admin-instance-v1-6.12.1-BFVz7rtnZfKL0xslKUB_uKkNZhJX-EMm-G8-R_A1hQA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-spanner-admin-instance-v1-6.12.1-BFVz7rtnZfKL0xslKUB_uKkNZhJX-EMm-G8-R_A1hQA.jar' is inaccessible.
Nov 04, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:49:14.763Z: Staged package proto-google-cloud-spanner-v1-6.12.1-s4xvKx28n2TpL6buW2o5wxJ_2nxIGjAN4WHs_WXhCu0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-spanner-v1-6.12.1-s4xvKx28n2TpL6buW2o5wxJ_2nxIGjAN4WHs_WXhCu0.jar' is inaccessible.
Nov 04, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:49:14.830Z: Staged package proto-google-common-protos-2.3.2-maqGpeUt_1i-QQtcMJBbLqmiyZWofiN_FgWQlTxmG20.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-common-protos-2.3.2-maqGpeUt_1i-QQtcMJBbLqmiyZWofiN_FgWQlTxmG20.jar' is inaccessible.
Nov 04, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:49:14.869Z: Staged package proto-google-iam-v1-1.0.14-5yLo1nFyqHrRtb07oCxnhISVgZDEnAWUlEU7B-fqXco.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-iam-v1-1.0.14-5yLo1nFyqHrRtb07oCxnhISVgZDEnAWUlEU7B-fqXco.jar' is inaccessible.
Nov 04, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:49:14.908Z: Staged package protobuf-java-3.17.3-SsVJsZJpQUGVgEnwYKHIJqMzQvYZ4QjO2MF9mHf14-0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/protobuf-java-3.17.3-SsVJsZJpQUGVgEnwYKHIJqMzQvYZ4QjO2MF9mHf14-0.jar' is inaccessible.
Nov 04, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:49:14.954Z: Staged package protobuf-java-util-3.17.3-vzIO0HYADh2MfL92AbBWrK7KuA91uaZZufY5jQ1-P3k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/protobuf-java-util-3.17.3-vzIO0HYADh2MfL92AbBWrK7KuA91uaZZufY5jQ1-P3k.jar' is inaccessible.
Nov 04, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-04T15:49:15.218Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 04, 2021 3:52:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-04T15:52:15.286Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
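Editor's note: the hundreds of near-identical SEVERE lines above each name one staged jar that the Dataflow service could not read back from the staging bucket. A minimal sketch for collapsing such a log into the unique affected packages (the helper name, regex, and sample line are illustrative, not part of Beam or this Jenkins job):

```python
import re

# Matches the Dataflow MonitoringUtil log format seen above:
#   SEVERE: <timestamp>: Staged package <jar> at location '<gs path>' is inaccessible.
STAGED_PACKAGE_RE = re.compile(
    r"Staged package (\S+) at location '(gs://[^']+)' is inaccessible"
)

def inaccessible_packages(log_text: str):
    """Return sorted unique (jar_name, gcs_path) pairs found in the log."""
    return sorted(set(STAGED_PACKAGE_RE.findall(log_text)))

# One SEVERE line copied from the log above, as a self-contained sample.
sample = (
    "SEVERE: 2021-11-04T15:55:12.669Z: Staged package "
    "guava-30.1.1-jre-RM4inOJtiAvzr8Niu_zsNNfmkD0ZW7sdufO24NmDTwY.jar "
    "at location 'gs://temp-storage-for-perf-tests/loadtests/staging/"
    "guava-30.1.1-jre-RM4inOJtiAvzr8Niu_zsNNfmkD0ZW7sdufO24NmDTwY.jar' "
    "is inaccessible."
)

for jar, path in inaccessible_packages(sample):
    print(jar, path)
```

Feeding the full console log through `inaccessible_packages` yields one entry per distinct jar, which makes it easier to confirm that every failure points at the same staging prefix (here `gs://temp-storage-for-perf-tests/loadtests/staging/`), i.e. a bucket-level permission problem rather than individual missing files.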
Nov 04, 2021 3:55:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:08.198Z: Staged package api-common-2.0.1-g8yqFlAvDwxYbosBUrWoK3ZxW93EQzdvb03u7M2bUvE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/api-common-2.0.1-g8yqFlAvDwxYbosBUrWoK3ZxW93EQzdvb03u7M2bUvE.jar' is inaccessible.
Nov 04, 2021 3:55:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:08.991Z: Staged package bigtable-client-core-1.23.1-xSDVAE9Xpy4qeW4--vFAYxs7AZ8EspP5_1fdu3yIbio.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/bigtable-client-core-1.23.1-xSDVAE9Xpy4qeW4--vFAYxs7AZ8EspP5_1fdu3yIbio.jar' is inaccessible.
Nov 04, 2021 3:55:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:09.038Z: Staged package bigtable-metrics-api-1.23.1-TdnTs_hKYdupIMUGd2wTnqTzdMVhrfwI5ciftUxIsTI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/bigtable-metrics-api-1.23.1-TdnTs_hKYdupIMUGd2wTnqTzdMVhrfwI5ciftUxIsTI.jar' is inaccessible.
Nov 04, 2021 3:55:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:09.220Z: Staged package commons-codec-1.15-s-n21jp5AQm_DQVmEfvtHPaQVYJt7-uYlKcTadJG7WM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-codec-1.15-s-n21jp5AQm_DQVmEfvtHPaQVYJt7-uYlKcTadJG7WM.jar' is inaccessible.
Nov 04, 2021 3:55:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:09.602Z: Staged package gax-2.3.0-eMOD-8BQkNf6hb3BUVVu0XlCJ625EV96u1iWoYhJDkE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/gax-2.3.0-eMOD-8BQkNf6hb3BUVVu0XlCJ625EV96u1iWoYhJDkE.jar' is inaccessible.
Nov 04, 2021 3:55:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:09.653Z: Staged package gax-grpc-2.3.0-go68qqPEG0mDW_ITj2MCF6xg1flj0dbLagP6N-meEcE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/gax-grpc-2.3.0-go68qqPEG0mDW_ITj2MCF6xg1flj0dbLagP6N-meEcE.jar' is inaccessible.
Nov 04, 2021 3:55:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:09.698Z: Staged package gax-httpjson-0.88.0-lHShTFD8TLYZ6Og3AlxdxDNaz-i-zOzHxXH621Tfr7I.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/gax-httpjson-0.88.0-lHShTFD8TLYZ6Og3AlxdxDNaz-i-zOzHxXH621Tfr7I.jar' is inaccessible.
Nov 04, 2021 3:55:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:09.774Z: Staged package google-api-client-1.32.1-d63CrvrOT8kqaYuvoPi6txarBRuyHLQQYA9d4qfmsw4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-api-client-1.32.1-d63CrvrOT8kqaYuvoPi6txarBRuyHLQQYA9d4qfmsw4.jar' is inaccessible.
Nov 04, 2021 3:55:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:09.828Z: Staged package google-api-client-jackson2-1.32.1-IHUWP1-fG6iVCC3WHq3xXUQivJT6Do-E9uGIMkVniWI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-api-client-jackson2-1.32.1-IHUWP1-fG6iVCC3WHq3xXUQivJT6Do-E9uGIMkVniWI.jar' is inaccessible.
Nov 04, 2021 3:55:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:09.901Z: Staged package google-api-services-bigquery-v2-rev20210813-1.32.1-7nw9MuSxBly9ryv9a52-E2PAUKahjZ3uDSELjX39KWY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-bigquery-v2-rev20210813-1.32.1-7nw9MuSxBly9ryv9a52-E2PAUKahjZ3uDSELjX39KWY.jar' is inaccessible.
Nov 04, 2021 3:55:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:09.950Z: Staged package google-api-services-clouddebugger-v2-rev20210813-1.32.1-tA91lCc58HMAryr4ZImcEVZcXxRVeUcBPdhsp2hGRLg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-clouddebugger-v2-rev20210813-1.32.1-tA91lCc58HMAryr4ZImcEVZcXxRVeUcBPdhsp2hGRLg.jar' is inaccessible.
Nov 04, 2021 3:55:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:10.005Z: Staged package google-api-services-cloudresourcemanager-v1-rev20210815-1.32.1-jjiHPuwSpPBmqZ5Jn8DVFaSBpRRY1kfbzRQj37XWFYo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-cloudresourcemanager-v1-rev20210815-1.32.1-jjiHPuwSpPBmqZ5Jn8DVFaSBpRRY1kfbzRQj37XWFYo.jar' is inaccessible.
Nov 04, 2021 3:55:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:10.083Z: Staged package google-api-services-dataflow-v1b3-rev20210818-1.32.1-BnHDq1zo3n0xyJIGrH3BrFg_puCQ1gikxyFLKnc01BA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-dataflow-v1b3-rev20210818-1.32.1-BnHDq1zo3n0xyJIGrH3BrFg_puCQ1gikxyFLKnc01BA.jar' is inaccessible.
Nov 04, 2021 3:55:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:10.127Z: Staged package google-api-services-healthcare-v1-rev20210806-1.32.1-SnySyLXRzcgtws4uL-LdxBk9pXHKSkqALaUpVX24lCk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-healthcare-v1-rev20210806-1.32.1-SnySyLXRzcgtws4uL-LdxBk9pXHKSkqALaUpVX24lCk.jar' is inaccessible.
Nov 04, 2021 3:55:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:10.207Z: Staged package google-api-services-pubsub-v1-rev20210809-1.32.1-Tr6gARITJUuN7ntkiL7S7o0i4J7WSYy-wzGrzZKhk2Y.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-pubsub-v1-rev20210809-1.32.1-Tr6gARITJUuN7ntkiL7S7o0i4J7WSYy-wzGrzZKhk2Y.jar' is inaccessible.
Nov 04, 2021 3:55:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:10.248Z: Staged package google-api-services-storage-v1-rev20210127-1.32.1-wjvrBbuEKr7RTI_XWj5rei9RsNPs4ZA6nF_Tq3cK2T4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-storage-v1-rev20210127-1.32.1-wjvrBbuEKr7RTI_XWj5rei9RsNPs4ZA6nF_Tq3cK2T4.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:10.370Z: Staged package google-auth-library-credentials-1.1.0-bpL3c79EMdarUbVBnhFrDdMakBZ35P-W3M5I9PK3Z3g.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-auth-library-credentials-1.1.0-bpL3c79EMdarUbVBnhFrDdMakBZ35P-W3M5I9PK3Z3g.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:10.524Z: Staged package google-auth-library-oauth2-http-1.1.0-wj6_4-5nU0FD-eJQlSbU7vu4BtTOUYB1SmxyH6lBwYY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-auth-library-oauth2-http-1.1.0-wj6_4-5nU0FD-eJQlSbU7vu4BtTOUYB1SmxyH6lBwYY.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:10.684Z: Staged package google-cloud-bigquery-2.1.4-acRfQOzlj8IvHdpxBQzjvS62zx7fJX4QTffgCZNrmY0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-bigquery-2.1.4-acRfQOzlj8IvHdpxBQzjvS62zx7fJX4QTffgCZNrmY0.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:10.779Z: Staged package google-cloud-bigquerystorage-2.1.0-rTRf2sdrbajSOilVmucmxLU9KIb3JYthfFcGKp7Tlfw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-bigquerystorage-2.1.0-rTRf2sdrbajSOilVmucmxLU9KIb3JYthfFcGKp7Tlfw.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:10.889Z: Staged package google-cloud-bigtable-2.1.0-CK5V53ksW3WZ9J4Bld4e1RwmvP3YnfFt5ftg3_Li5Z0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-bigtable-2.1.0-CK5V53ksW3WZ9J4Bld4e1RwmvP3YnfFt5ftg3_Li5Z0.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:10.989Z: Staged package google-cloud-core-2.1.0-V-_lolxckO-O1jfRQnLh7h3dFdZzvH61DyWVuUtsobc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-core-2.1.0-V-_lolxckO-O1jfRQnLh7h3dFdZzvH61DyWVuUtsobc.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:11.042Z: Staged package google-cloud-core-grpc-2.1.0-5vUOMd-3N4ETe4Usmj7x_SySNTbkMqFNyVKZBe2YEVc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-core-grpc-2.1.0-5vUOMd-3N4ETe4Usmj7x_SySNTbkMqFNyVKZBe2YEVc.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:11.107Z: Staged package google-cloud-core-http-2.1.0-RGVkQiv87L4ZbuLar_XW_aYIoGpacuBbYwLg1QLedYo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-core-http-2.1.0-RGVkQiv87L4ZbuLar_XW_aYIoGpacuBbYwLg1QLedYo.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:11.169Z: Staged package google-cloud-firestore-3.0.2-tKeDy091f9U7vlOalQx0yqjzXhBiwT4GkrjzBW2BCws.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-firestore-3.0.2-tKeDy091f9U7vlOalQx0yqjzXhBiwT4GkrjzBW2BCws.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:11.216Z: Staged package google-cloud-pubsub-1.114.2-4kM0U8-6GadXz75swlcehVyseIl4rSzBPW0XvlAlEB8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-pubsub-1.114.2-4kM0U8-6GadXz75swlcehVyseIl4rSzBPW0XvlAlEB8.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:11.298Z: Staged package google-cloud-spanner-6.12.1-9r3EtK4WZZ485rU7NAfKI7xhEyFZ1ApnZ7gAUElnMe4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-spanner-6.12.1-9r3EtK4WZZ485rU7NAfKI7xhEyFZ1ApnZ7gAUElnMe4.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:11.497Z: Staged package grpc-alts-1.40.0-YwW5drbCb3_Wyf9z2n6bFZ5X1T7_FigR7di02444Sk0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-alts-1.40.0-YwW5drbCb3_Wyf9z2n6bFZ5X1T7_FigR7di02444Sk0.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:11.555Z: Staged package grpc-api-1.40.0-6JlsF6D_ZmXDRj9oACWaN1WqPUhjxdUXN7k7EegYoL0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-api-1.40.0-6JlsF6D_ZmXDRj9oACWaN1WqPUhjxdUXN7k7EegYoL0.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:11.605Z: Staged package grpc-auth-1.40.0-PoW7NsnBJ2_XWeC17cLgxQx16n0N7jZaneUMJPIz9UM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-auth-1.40.0-PoW7NsnBJ2_XWeC17cLgxQx16n0N7jZaneUMJPIz9UM.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:11.660Z: Staged package grpc-context-1.40.0-MYgqv87MjQnKh6T1FEFMOr4NjNKmKzeSSetW1j7bmXQ.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-context-1.40.0-MYgqv87MjQnKh6T1FEFMOr4NjNKmKzeSSetW1j7bmXQ.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:11.711Z: Staged package grpc-core-1.40.0-jXEll3JqBHjtCl4FzFZi4aa3ue--LVhdQ8lH7JQnW4s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-core-1.40.0-jXEll3JqBHjtCl4FzFZi4aa3ue--LVhdQ8lH7JQnW4s.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:11.766Z: Staged package grpc-gcp-1.1.0-zKq--BAvLo2Cks8hjNm7iNASsOWFeFMqwc55hcc1YEY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-gcp-1.1.0-zKq--BAvLo2Cks8hjNm7iNASsOWFeFMqwc55hcc1YEY.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:11.836Z: Staged package grpc-google-cloud-bigquerystorage-v1-2.1.0-pmqLBn9jJX6cMBQNRZqEOdSbrNClk757IV6iPV2sc-E.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-bigquerystorage-v1-2.1.0-pmqLBn9jJX6cMBQNRZqEOdSbrNClk757IV6iPV2sc-E.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:11.880Z: Staged package grpc-google-cloud-bigquerystorage-v1beta1-0.125.0-lie7DrgZCGZoIdynaS5DtII9wwrPhUmrclc0mC1pRJ0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-bigquerystorage-v1beta1-0.125.0-lie7DrgZCGZoIdynaS5DtII9wwrPhUmrclc0mC1pRJ0.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:11.935Z: Staged package grpc-google-cloud-bigquerystorage-v1beta2-0.125.0-zOXapRF30ZRCTchndCTgYjr297vPZNoLFuJiRrg4rqE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-bigquerystorage-v1beta2-0.125.0-zOXapRF30ZRCTchndCTgYjr297vPZNoLFuJiRrg4rqE.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:11.973Z: Staged package grpc-google-cloud-bigtable-admin-v2-2.1.0-O9M5vtkkhXZ1hJGrv61OGCaET60iZucVgfx6SVbGMo8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-bigtable-admin-v2-2.1.0-O9M5vtkkhXZ1hJGrv61OGCaET60iZucVgfx6SVbGMo8.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:12.026Z: Staged package grpc-google-cloud-bigtable-v2-2.1.0-xRpHCbcaKPpU2xjJ4uKK1sDd2cpQMum0QacDJEpb2wg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-bigtable-v2-2.1.0-xRpHCbcaKPpU2xjJ4uKK1sDd2cpQMum0QacDJEpb2wg.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:12.082Z: Staged package grpc-google-cloud-pubsub-v1-1.96.2-LSfgPfeyJMuoujfLiVJGuAcqKQh1TjDhBNFwA-L3s1I.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-pubsub-v1-1.96.2-LSfgPfeyJMuoujfLiVJGuAcqKQh1TjDhBNFwA-L3s1I.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:12.147Z: Staged package grpc-google-cloud-spanner-admin-database-v1-6.12.1-unPYqI58YhKv_6mc6T6YQgqBcvtRPSdGpo1hXi2mQQc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-spanner-admin-database-v1-6.12.1-unPYqI58YhKv_6mc6T6YQgqBcvtRPSdGpo1hXi2mQQc.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:12.199Z: Staged package grpc-google-cloud-spanner-admin-instance-v1-6.12.1-vVVDW3Mzs5VrMu-ONFpIupQowCosX0kgctveASt1T_M.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-spanner-admin-instance-v1-6.12.1-vVVDW3Mzs5VrMu-ONFpIupQowCosX0kgctveASt1T_M.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:12.244Z: Staged package grpc-google-cloud-spanner-v1-6.12.1-sTZfm0NLp3Dt4c16Ot_ONtv-qQgCVvddmp8P56UjCgs.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-cloud-spanner-v1-6.12.1-sTZfm0NLp3Dt4c16Ot_ONtv-qQgCVvddmp8P56UjCgs.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:12.315Z: Staged package grpc-google-common-protos-2.3.2-027cVEqmfFtfw0r7Y8evdsYc_NET1FNnWGqtGva-nn8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-google-common-protos-2.3.2-027cVEqmfFtfw0r7Y8evdsYc_NET1FNnWGqtGva-nn8.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:12.358Z: Staged package grpc-grpclb-1.40.0-3x_pdrKP1JFP5mkjhs5ctpBStgFD0Av1ls3HyVryo0M.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-grpclb-1.40.0-3x_pdrKP1JFP5mkjhs5ctpBStgFD0Av1ls3HyVryo0M.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:12.411Z: Staged package grpc-netty-1.40.0-KI8g3du3gxngQ-zUfmn47sHGrI-vrl_7QxMVy8tV0R0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-netty-1.40.0-KI8g3du3gxngQ-zUfmn47sHGrI-vrl_7QxMVy8tV0R0.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:12.474Z: Staged package grpc-netty-shaded-1.40.0-bfX9Qgk5ZMZ_L1OTR0Izgy70u2hNSvIXKHItGr9w9Z4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-netty-shaded-1.40.0-bfX9Qgk5ZMZ_L1OTR0Izgy70u2hNSvIXKHItGr9w9Z4.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:12.524Z: Staged package grpc-protobuf-1.40.0-9lmDVCdqFREyDkUqGEg3MmMsmnOiNyuewKZsmoJI8pg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-protobuf-1.40.0-9lmDVCdqFREyDkUqGEg3MmMsmnOiNyuewKZsmoJI8pg.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:12.562Z: Staged package grpc-protobuf-lite-1.40.1-wX_o9Paj_2Dy3BxRGYFcW8vMOWfmCrHlOCckWeT9aac.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-protobuf-lite-1.40.1-wX_o9Paj_2Dy3BxRGYFcW8vMOWfmCrHlOCckWeT9aac.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:12.605Z: Staged package grpc-stub-1.40.0--7XO3mWD78nDt0upNPSfu4LJ8OX52rRbz7Lxg1wFRcs.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/grpc-stub-1.40.0--7XO3mWD78nDt0upNPSfu4LJ8OX52rRbz7Lxg1wFRcs.jar' is inaccessible.
Nov 04, 2021 3:55:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:12.669Z: Staged package guava-30.1.1-jre-RM4inOJtiAvzr8Niu_zsNNfmkD0ZW7sdufO24NmDTwY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/guava-30.1.1-jre-RM4inOJtiAvzr8Niu_zsNNfmkD0ZW7sdufO24NmDTwY.jar' is inaccessible.
Nov 04, 2021 3:55:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:12.766Z: Staged package httpclient-4.5.13-b-kCalZsalABYIzz_DIZZkH2weXhmG0QN8zb1fMe90M.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/httpclient-4.5.13-b-kCalZsalABYIzz_DIZZkH2weXhmG0QN8zb1fMe90M.jar' is inaccessible.
Nov 04, 2021 3:55:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:12.846Z: Staged package httpcore-4.4.14--VYgnkUMsdDFF3bfvSPlPp3Y25oSmO1itwvwlEumOyg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/httpcore-4.4.14--VYgnkUMsdDFF3bfvSPlPp3Y25oSmO1itwvwlEumOyg.jar' is inaccessible.
Nov 04, 2021 3:55:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:13.006Z: Staged package jackson-annotations-2.12.4-9qo3Bqh1aJtmzawzNPZd_beVzPrUEXvwcok7GW7R7I4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-annotations-2.12.4-9qo3Bqh1aJtmzawzNPZd_beVzPrUEXvwcok7GW7R7I4.jar' is inaccessible.
Nov 04, 2021 3:55:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:13.052Z: Staged package jackson-core-2.12.4-NQbOR-wmBK4tgNeVBffLN09xgGBjlBXAfRRK2t0taKM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-core-2.12.4-NQbOR-wmBK4tgNeVBffLN09xgGBjlBXAfRRK2t0taKM.jar' is inaccessible.
Nov 04, 2021 3:55:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:13.155Z: Staged package jackson-databind-2.12.4-6Zp7S4kHS8aJqrzZ6x8sExi2jMXDSXna8-NO3FWMegE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-databind-2.12.4-6Zp7S4kHS8aJqrzZ6x8sExi2jMXDSXna8-NO3FWMegE.jar' is inaccessible.
Nov 04, 2021 3:55:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:13.221Z: Staged package jackson-dataformat-yaml-2.12.4--CQtUTyPlCxyPdbUU6o76-9mn6Y4PKEIIDPvr5FbTdQ.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-dataformat-yaml-2.12.4--CQtUTyPlCxyPdbUU6o76-9mn6Y4PKEIIDPvr5FbTdQ.jar' is inaccessible.
Nov 04, 2021 3:55:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:13.684Z: Staged package metrics-core-3.1.2-JFuipmqbxxDOTbFHERJud7y05tlu9-YiZZKA88kMu1w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/metrics-core-3.1.2-JFuipmqbxxDOTbFHERJud7y05tlu9-YiZZKA88kMu1w.jar' is inaccessible.
Nov 04, 2021 3:55:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:14.219Z: Staged package proto-google-cloud-bigquerystorage-v1-2.1.0--M8Hmi82bEAs_rlZfGOmOJeniDwYV9GXd4uYuCIzxIg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-bigquerystorage-v1-2.1.0--M8Hmi82bEAs_rlZfGOmOJeniDwYV9GXd4uYuCIzxIg.jar' is inaccessible.
Nov 04, 2021 3:55:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:14.259Z: Staged package proto-google-cloud-bigquerystorage-v1beta1-0.125.0-qKRRBewW8WdLWXjsY9uGABgBUBdfwqmhSwqFkWRQV1Q.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-bigquerystorage-v1beta1-0.125.0-qKRRBewW8WdLWXjsY9uGABgBUBdfwqmhSwqFkWRQV1Q.jar' is inaccessible.
Nov 04, 2021 3:55:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:14.330Z: Staged package proto-google-cloud-bigquerystorage-v1beta2-0.125.0-JSL7XosXdUS5ZGraL_j5Dnbf6dY04ZrPTdxB_T_NsNc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-bigquerystorage-v1beta2-0.125.0-JSL7XosXdUS5ZGraL_j5Dnbf6dY04ZrPTdxB_T_NsNc.jar' is inaccessible.
Nov 04, 2021 3:55:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:14.368Z: Staged package proto-google-cloud-bigtable-admin-v2-2.1.0-Oha95NdNmvlfAh6KTLXm9FygpotYhLsS7Cb837ZbIJo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-bigtable-admin-v2-2.1.0-Oha95NdNmvlfAh6KTLXm9FygpotYhLsS7Cb837ZbIJo.jar' is inaccessible.
Nov 04, 2021 3:55:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:14.418Z: Staged package proto-google-cloud-bigtable-v2-2.1.0-BA4Xw2RDAfoRzQCX0ac5VopZzaDym4fSBY0Msbm8txM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-bigtable-v2-2.1.0-BA4Xw2RDAfoRzQCX0ac5VopZzaDym4fSBY0Msbm8txM.jar' is inaccessible.
Nov 04, 2021 3:55:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:14.466Z: Staged package proto-google-cloud-datastore-v1-0.91.3-vS4xd5n4PaDd-Kyt6_V3G8YGqO_QMcuShghC9MLpSns.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-datastore-v1-0.91.3-vS4xd5n4PaDd-Kyt6_V3G8YGqO_QMcuShghC9MLpSns.jar' is inaccessible.
Nov 04, 2021 3:55:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:14.514Z: Staged package proto-google-cloud-firestore-bundle-v1-3.0.2-vr6gpK2KOX9hYv8tzTr4-RsKC2t_wRcUKeLYKqLm2oQ.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-firestore-bundle-v1-3.0.2-vr6gpK2KOX9hYv8tzTr4-RsKC2t_wRcUKeLYKqLm2oQ.jar' is inaccessible.
Nov 04, 2021 3:55:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:14.554Z: Staged package proto-google-cloud-firestore-v1-3.0.2-e7pBKXoawrVbDlBanNo5GpdjMFfUfMropSFYGm-RA6I.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-firestore-v1-3.0.2-e7pBKXoawrVbDlBanNo5GpdjMFfUfMropSFYGm-RA6I.jar' is inaccessible.
Nov 04, 2021 3:55:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:14.602Z: Staged package proto-google-cloud-pubsub-v1-1.96.2-Hd5TFaDg9u0PWkcRBAM7lW0P0qLQO4y5u-JvPoVBYsQ.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-pubsub-v1-1.96.2-Hd5TFaDg9u0PWkcRBAM7lW0P0qLQO4y5u-JvPoVBYsQ.jar' is inaccessible.
Nov 04, 2021 3:55:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:14.665Z: Staged package proto-google-cloud-spanner-admin-database-v1-6.12.1-i6wlp_PYFwGObAmINb8JcCzH8AbDpzYSVJyQTC4R8EA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-spanner-admin-database-v1-6.12.1-i6wlp_PYFwGObAmINb8JcCzH8AbDpzYSVJyQTC4R8EA.jar' is inaccessible.
Nov 04, 2021 3:55:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:14.714Z: Staged package proto-google-cloud-spanner-admin-instance-v1-6.12.1-BFVz7rtnZfKL0xslKUB_uKkNZhJX-EMm-G8-R_A1hQA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-spanner-admin-instance-v1-6.12.1-BFVz7rtnZfKL0xslKUB_uKkNZhJX-EMm-G8-R_A1hQA.jar' is inaccessible.
Nov 04, 2021 3:55:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:14.757Z: Staged package proto-google-cloud-spanner-v1-6.12.1-s4xvKx28n2TpL6buW2o5wxJ_2nxIGjAN4WHs_WXhCu0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-cloud-spanner-v1-6.12.1-s4xvKx28n2TpL6buW2o5wxJ_2nxIGjAN4WHs_WXhCu0.jar' is inaccessible.
Nov 04, 2021 3:55:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:14.810Z: Staged package proto-google-common-protos-2.3.2-maqGpeUt_1i-QQtcMJBbLqmiyZWofiN_FgWQlTxmG20.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-common-protos-2.3.2-maqGpeUt_1i-QQtcMJBbLqmiyZWofiN_FgWQlTxmG20.jar' is inaccessible.
Nov 04, 2021 3:55:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:14.846Z: Staged package proto-google-iam-v1-1.0.14-5yLo1nFyqHrRtb07oCxnhISVgZDEnAWUlEU7B-fqXco.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/proto-google-iam-v1-1.0.14-5yLo1nFyqHrRtb07oCxnhISVgZDEnAWUlEU7B-fqXco.jar' is inaccessible.
Nov 04, 2021 3:55:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:14.902Z: Staged package protobuf-java-3.17.3-SsVJsZJpQUGVgEnwYKHIJqMzQvYZ4QjO2MF9mHf14-0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/protobuf-java-3.17.3-SsVJsZJpQUGVgEnwYKHIJqMzQvYZ4QjO2MF9mHf14-0.jar' is inaccessible.
Nov 04, 2021 3:55:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-04T15:55:14.949Z: Staged package protobuf-java-util-3.17.3-vzIO0HYADh2MfL92AbBWrK7KuA91uaZZufY5jQ1-P3k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/protobuf-java-util-3.17.3-vzIO0HYADh2MfL92AbBWrK7KuA91uaZZufY5jQ1-P3k.jar' is inaccessible.
Nov 04, 2021 3:55:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-04T15:55:15.229Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 04, 2021 3:58:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-04T15:58:12.604Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
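The long run of SEVERE entries above all share one "Staged package … is inaccessible" pattern, so triage usually starts by reducing the log to just the affected artifacts. A throwaway helper sketch for that (not part of Beam; the class and method names are invented for illustration):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/** Throwaway helper (not part of Beam) that pulls the names of
 *  inaccessible staged packages out of Dataflow monitoring logs. */
public class StagedPackageScanner {
    private static final Pattern INACCESSIBLE = Pattern.compile(
        "Staged package (\\S+) at location '(gs://\\S+)' is inaccessible");

    /** Returns the package file names reported as inaccessible, in log order. */
    public static List<String> inaccessiblePackages(List<String> logLines) {
        List<String> names = new ArrayList<>();
        for (String line : logLines) {
            Matcher m = INACCESSIBLE.matcher(line);
            if (m.find()) {
                names.add(m.group(1));
            }
        }
        return names;
    }

    public static void main(String[] args) {
        List<String> sample = List.of(
            "SEVERE: 2021-11-04T15:55:12.605Z: Staged package grpc-stub-1.40.0-abc.jar "
                + "at location 'gs://temp-storage-for-perf-tests/loadtests/staging/"
                + "grpc-stub-1.40.0-abc.jar' is inaccessible.",
            "INFO: 2021-11-04T16:00:37.556Z: Cleaning up.");
        System.out.println(inaccessiblePackages(sample)); // prints [grpc-stub-1.40.0-abc.jar]
    }
}
```

Feeding the SEVERE lines above through `inaccessiblePackages` yields the jar names to spot-check against the staging bucket's contents and permissions.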
Nov 04, 2021 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-04T16:00:37.494Z: Cancel request is committed for workflow job: 2021-11-04_05_45_34-7043171245915014881.
Nov 04, 2021 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-04T16:00:37.556Z: Cleaning up.
Nov 04, 2021 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-04T16:00:37.637Z: Stopping worker pool...
Nov 04, 2021 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-04T16:00:37.681Z: Stopping worker pool...
Nov 04, 2021 4:02:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-04T16:02:54.634Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 04, 2021 4:02:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-04T16:02:54.684Z: Worker pool stopped.
Nov 04, 2021 4:03:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-04_05_45_34-7043171245915014881 finished with status CANCELLED.
Load test results for test (ID): 1bde0e43-cd5b-4b7f-adcd-ae88e876fc0a and timestamp: 2021-11-04T12:45:29.078000000Z:
                              Metric:          Value:
       dataflow_v2_java11_runtime_sec       11477.704
 dataflow_v2_java11_total_bytes_count   1.52906431E10
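From the two metrics above, a back-of-the-envelope throughput for the cancelled run can be derived (total bytes over runtime, assuming the byte count covers the same window as the runtime metric):

```java
public class Throughput {
    public static void main(String[] args) {
        double runtimeSec = 11477.704;       // dataflow_v2_java11_runtime_sec
        double totalBytes = 1.52906431E10;   // dataflow_v2_java11_total_bytes_count
        // Rough average throughput over the run, about 1.33 MB/s:
        System.out.printf("%.0f bytes/sec%n", totalBytes / runtimeSec);
    }
}
```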
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
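The stack trace shows the load-test harness treating the CANCELLED terminal state as a failure. A simplified sketch of that check (the names mirror the trace, but the body is illustrative, not Beam's actual implementation):

```java
/** Illustrative sketch of the terminal-state check behind the
 *  RuntimeException above; a simplification, not Beam's real JobFailure. */
public class JobFailure {
    enum State { DONE, CANCELLED, FAILED }

    static void handleFailure(State terminalState) {
        // A load test only counts as successful when the job runs to completion;
        // CANCELLED here typically means the job was cancelled before finishing.
        if (terminalState != State.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        handleFailure(State.DONE); // completes silently
        try {
            handleFailure(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // prints "Invalid job state: CANCELLED."
        }
    }
}
```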

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211104124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6c5da4ee64ea8d299cc29d8f441c82f5069a9c3d4a58e4c56e2947aa39c43766
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211104124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6c5da4ee64ea8d299cc29d8f441c82f5069a9c3d4a58e4c56e2947aa39c43766]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211104124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6c5da4ee64ea8d299cc29d8f441c82f5069a9c3d4a58e4c56e2947aa39c43766])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6c5da4ee64ea8d299cc29d8f441c82f5069a9c3d4a58e4c56e2947aa39c43766
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6c5da4ee64ea8d299cc29d8f441c82f5069a9c3d4a58e4c56e2947aa39c43766
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6c5da4ee64ea8d299cc29d8f441c82f5069a9c3d4a58e4c56e2947aa39c43766].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 47s
101 actionable tasks: 71 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5maektvgqyias

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #139

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/139/display/redirect?page=changes>

Changes:

[Andrew Pilloud] [BEAM-13118] Use branch for nightly "tag", tags are immutable

[dmitrii_kuzin] [BEAM-12730] Python. Custom delimiter add corner case

[Alexey Romanenko] [BEAM-12070] ParquetIO: use splittable reading by default

[noreply] [BEAM-13001] fixes nil reference error in Extractor.ExtractFrom (#15865)

[noreply] Merge pull request #15842 from [BEAM-13125][Playground] Update

[noreply] Merge pull request #15838 from [BEAM-13127] [Playground] Implement TCP

[noreply] Merge pull request #15832 from [BEAM-13149] Changing readWithPartitions

[noreply] Minor: add resource key to presentation materials link (#15867)


------------------------------------------
[...truncated 48.69 KB...]
2dd1db7da1dd: Preparing
53c9017b1be0: Preparing
55df7bbac955: Preparing
4235ea42d291: Preparing
a15f236935d4: Preparing
78700b6b35d0: Preparing
2dd1db7da1dd: Waiting
62a5b8741e83: Preparing
53c9017b1be0: Waiting
36e0782f1159: Preparing
55df7bbac955: Waiting
ba6e5ff31f23: Preparing
4235ea42d291: Waiting
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
60268ebf88f8: Waiting
62a747bf1719: Preparing
62a5b8741e83: Waiting
ba6e5ff31f23: Waiting
36e0782f1159: Waiting
9f9f651e9303: Waiting
62a747bf1719: Waiting
0b3c02b5d746: Waiting
1ff23cddd2eb: Waiting
a15f236935d4: Waiting
97d68374f77d: Pushed
36bf3ba895ba: Pushed
922e1df58ed7: Pushed
71cfa21f6028: Pushed
20ebdb385def: Pushed
60268ebf88f8: Pushed
2dd1db7da1dd: Pushed
53c9017b1be0: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
4235ea42d291: Pushed
36e0782f1159: Layer already exists
1ff23cddd2eb: Pushed
ba6e5ff31f23: Layer already exists
9f9f651e9303: Layer already exists
0b3c02b5d746: Layer already exists
a15f236935d4: Pushed
62a747bf1719: Layer already exists
55df7bbac955: Pushed
20211103124333: digest: sha256:df3ff75fba7ea1b2b418e87b8486098e715472f701f3e15685db6a4557f26568 size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 03, 2021 12:45:38 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 03, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Nov 03, 2021 12:45:39 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 03, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 03, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 03, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Nov 03, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 03, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 860a91ac9ed28a235533986fd6cfa0e4f6cbefd47ea851fe7dad903d12c82c90> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-hgqRrJ7SiiNVM5hv1s-g5PbL79R-qFH-fa2QPRLILJA.pb
Nov 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 03, 2021 12:45:43 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33425811, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b74b35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@e4e1ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d11ceef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cb2918c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@72e295cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@c2584d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa0450e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37468787, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@51ec2856, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@714f3da4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1caa9eb6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f53481b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2fcd7d3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27e7c77f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f70a21b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ae62c7e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e869098, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37c36608, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d497a91]
Nov 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 03, 2021 12:45:43 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@28bdbe88, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a87026, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ef60710, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@600f5704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2503ec73, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@606f81b5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e1fc42f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44b21f9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ee8130e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6296474f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4288d98e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@169268a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@285c6918, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78a0ff63, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c601d50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79b2852b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@326d27ac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d499d65, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@313f8301, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5cc9d3d0]
Nov 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Nov 03, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-03_05_45_43-6875680358526166536?project=apache-beam-testing
Nov 03, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-03_05_45_43-6875680358526166536
Nov 03, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-03_05_45_43-6875680358526166536
Nov 03, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-03T12:45:50.887Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-glln. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 03, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:57.153Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 03, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:57.808Z: Expanding SplittableParDo operations into optimizable parts.
Nov 03, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:57.839Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 03, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:57.904Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 03, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:57.965Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 03, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:57.987Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.071Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.171Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.210Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.235Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.259Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.294Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.328Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.373Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.396Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.428Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.461Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.496Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.539Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.571Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.615Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.649Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.684Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.718Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.754Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.784Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.812Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.837Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.860Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:58.892Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 03, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:45:59.292Z: Starting 5 workers in us-central1-a...
Nov 03, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:46:26.473Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 03, 2021 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:46:51.302Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 03, 2021 12:47:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:47:45.489Z: Workers have started successfully.
Nov 03, 2021 12:47:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T12:47:45.591Z: Workers have started successfully.
Nov 03, 2021 2:16:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-03T14:15:59.432Z: Staged package animal-sniffer-annotations-1.20-vt1E38otwrj1wIzR1vDgznQJTsZ3gSYJaOA_wOd1Iqw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/animal-sniffer-annotations-1.20-vt1E38otwrj1wIzR1vDgznQJTsZ3gSYJaOA_wOd1Iqw.jar' is inaccessible.
Nov 03, 2021 2:16:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-11-03T14:16:02.958Z: Staged package opencensus-contrib-http-util-0.28.0-ScPbKinx_bL3OSjL6pab0dQPq3zFu2JzAiur2W96eJs.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/opencensus-contrib-http-util-0.28.0-ScPbKinx_bL3OSjL6pab0dQPq3zFu2JzAiur2W96eJs.jar' is inaccessible.
Nov 03, 2021 2:16:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-03T14:16:03.509Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 03, 2021 2:19:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-03T14:19:03.274Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Nov 03, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T16:00:32.098Z: Cancel request is committed for workflow job: 2021-11-03_05_45_43-6875680358526166536.
Nov 03, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T16:00:32.146Z: Cleaning up.
Nov 03, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T16:00:32.233Z: Stopping worker pool...
Nov 03, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T16:00:32.290Z: Stopping worker pool...
Nov 03, 2021 4:02:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T16:02:52.793Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 03, 2021 4:02:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-03T16:02:52.860Z: Worker pool stopped.
Nov 03, 2021 4:02:59 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-03_05_45_43-6875680358526166536 finished with status CANCELLED.
Load test results for test (ID): e5970ce7-384f-4aa3-ae5e-7a85df0b2952 and timestamp: 2021-11-03T12:45:38.884000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11535.531
dataflow_v2_java11_total_bytes_count             2.25421996E10
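For reference, the two-column metrics block above can be pulled out of a raw console log with a short script. This is a hypothetical convenience parser, not part of the Beam load-test harness; the line format (metric name, whitespace, numeric value, possibly in scientific notation) is assumed from this build's output.

```python
import re

def parse_load_test_metrics(lines):
    """Parse 'metric_name    value' rows printed after a Beam load test run.

    Assumption: each row is a bare metric name followed by whitespace and a
    single numeric value, as seen in the console output above.
    """
    metrics = {}
    for line in lines:
        m = re.match(r"^(\w+)\s+([0-9.Ee+-]+)$", line.strip())
        if m:
            metrics[m.group(1)] = float(m.group(2))
    return metrics

rows = [
    "dataflow_v2_java11_runtime_sec                 11535.531",
    "dataflow_v2_java11_total_bytes_count             2.25421996E10",
]
print(parse_load_test_metrics(rows))
```

Header rows such as "Metric: Value:" simply fail the match and are skipped, so the whole tail of the log can be fed in unfiltered.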
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
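The stack trace above suggests the build fails at the terminal-state check rather than from a pipeline crash: a cancel request was committed for the job, and the harness's JobFailure.handleFailure then surfaces any terminal state other than DONE as an error. A minimal sketch of that kind of check follows; it is illustrative only, written from the behavior implied by the log, and is not Beam's actual implementation.

```python
# Illustrative sketch (assumption: mirrors the behavior implied by the
# "Invalid job state: CANCELLED." stack trace above, not Beam's source).

def handle_failure(terminal_state: str) -> None:
    # Any terminal state other than DONE (e.g. CANCELLED, FAILED) is
    # raised as an error, which in turn fails the Gradle 'run' task.
    if terminal_state != "DONE":
        raise RuntimeError(f"Invalid job state: {terminal_state}.")

handle_failure("DONE")  # a successfully completed job passes the check
```

Under this reading, a job that is cancelled for any reason (manually, by a harness timeout, or by CI cleanup) is reported as a build failure even if the pipeline itself ran without errors.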

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211103124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:df3ff75fba7ea1b2b418e87b8486098e715472f701f3e15685db6a4557f26568
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211103124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:df3ff75fba7ea1b2b418e87b8486098e715472f701f3e15685db6a4557f26568]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211103124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:df3ff75fba7ea1b2b418e87b8486098e715472f701f3e15685db6a4557f26568])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:df3ff75fba7ea1b2b418e87b8486098e715472f701f3e15685db6a4557f26568
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:df3ff75fba7ea1b2b418e87b8486098e715472f701f3e15685db6a4557f26568
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:df3ff75fba7ea1b2b418e87b8486098e715472f701f3e15685db6a4557f26568].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 43s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5ysaqeql23r7q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #138

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/138/display/redirect?page=changes>

Changes:

[aydar.zaynutdinov] [BEAM-13145][Playground]

[avilovpavel6] Added cleanup os env before and after tests

[noreply] [BEAM-11936] Fix errorprone warnings (#15821)

[noreply] [BEAM-13001] fixes Golint issues (#15859)

[noreply] Merge pull request #15840 from [Playground][BEAM-13146][Bugfix]


------------------------------------------
[...truncated 48.50 KB...]
6f2d0f609fa2: Preparing
fbfe7e609ece: Preparing
2b9a9942b6c1: Preparing
5a652505ed5e: Preparing
acf4563f911e: Preparing
a8b9b2257dd2: Preparing
1efe0a188f85: Preparing
4f20cc1160e5: Preparing
49de65239a80: Preparing
f1ecb3cf56ea: Preparing
8065d226bf37: Preparing
1a2a9a996bfa: Preparing
78700b6b35d0: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
62a747bf1719: Preparing
78700b6b35d0: Waiting
62a5b8741e83: Waiting
a8b9b2257dd2: Waiting
36e0782f1159: Waiting
1efe0a188f85: Waiting
ba6e5ff31f23: Waiting
9f9f651e9303: Waiting
4f20cc1160e5: Waiting
0b3c02b5d746: Waiting
49de65239a80: Waiting
62a747bf1719: Waiting
f1ecb3cf56ea: Waiting
8065d226bf37: Waiting
1a2a9a996bfa: Waiting
2b9a9942b6c1: Pushed
fbfe7e609ece: Pushed
acf4563f911e: Pushed
a8b9b2257dd2: Pushed
6f2d0f609fa2: Pushed
5a652505ed5e: Pushed
4f20cc1160e5: Pushed
49de65239a80: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
1a2a9a996bfa: Pushed
9f9f651e9303: Layer already exists
8065d226bf37: Pushed
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
1efe0a188f85: Pushed
f1ecb3cf56ea: Pushed
20211102124331: digest: sha256:30ab3ece2033b5c69b96f0b184ad618b152dfd79c2c671732c3750af8b2476ea size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 02, 2021 12:45:35 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 02, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Nov 02, 2021 12:45:36 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 02, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 02, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 02, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Nov 02, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 02, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash f0f335e69457a4717c3a309eebb6f69db118f7d853ca4b045fe8aa506627b9bc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-8PM15pRXpHF8OjCe67b2nbEY99hTyksEX-iqUGYnubw.pb
Nov 02, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 02, 2021 12:45:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4bd51d3e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33425811, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b74b35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@e4e1ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d11ceef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cb2918c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@72e295cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@c2584d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa0450e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37468787, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@51ec2856, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@714f3da4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1caa9eb6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f53481b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2fcd7d3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27e7c77f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f70a21b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ae62c7e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e869098, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37c36608]
Nov 02, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 02, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 02, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 02, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 02, 2021 12:45:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53a7a60c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@28bdbe88, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a87026, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ef60710, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@600f5704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2503ec73, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@606f81b5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e1fc42f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44b21f9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ee8130e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6296474f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4288d98e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@169268a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@285c6918, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78a0ff63, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c601d50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79b2852b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@326d27ac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d499d65, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@313f8301]
Nov 02, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 02, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 02, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 02, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 02, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 02, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 02, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 02, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 02, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 02, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 02, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 02, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Nov 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-02_05_45_41-11128727296524182537?project=apache-beam-testing
Nov 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-02_05_45_41-11128727296524182537
Nov 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-02_05_45_41-11128727296524182537
Nov 02, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-02T12:45:47.977Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-mihj. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:53.037Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:53.863Z: Expanding SplittableParDo operations into optimizable parts.
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:53.897Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:53.998Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.080Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.100Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.179Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.286Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.325Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.346Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.368Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.424Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.455Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.502Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.555Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.594Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.630Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.669Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.702Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.736Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.760Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.783Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.813Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.837Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.861Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.884Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.910Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.953Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:54.985Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 02, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:55.012Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:45:55.373Z: Starting 5 workers in us-central1-a...
Nov 02, 2021 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:46:27.695Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 02, 2021 12:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:46:35.270Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 02, 2021 12:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:47:31.354Z: Workers have started successfully.
Nov 02, 2021 12:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T12:47:31.387Z: Workers have started successfully.
Nov 02, 2021 4:00:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T16:00:23.175Z: Cancel request is committed for workflow job: 2021-11-02_05_45_41-11128727296524182537.
Nov 02, 2021 4:00:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T16:00:23.241Z: Cleaning up.
Nov 02, 2021 4:00:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T16:00:23.311Z: Stopping worker pool...
Nov 02, 2021 4:00:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T16:00:23.372Z: Stopping worker pool...
Nov 02, 2021 4:02:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T16:02:47.559Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 02, 2021 4:02:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-02T16:02:47.598Z: Worker pool stopped.
Nov 02, 2021 4:02:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-02_05_45_41-11128727296524182537 finished with status CANCELLED.
Load test results for test (ID): 160e81c4-91cd-4804-9359-1de69e181b0f and timestamp: 2021-11-02T12:45:36.005000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11532.046
dataflow_v2_java11_total_bytes_count             2.71981656E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211102124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30ab3ece2033b5c69b96f0b184ad618b152dfd79c2c671732c3750af8b2476ea
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211102124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30ab3ece2033b5c69b96f0b184ad618b152dfd79c2c671732c3750af8b2476ea]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211102124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30ab3ece2033b5c69b96f0b184ad618b152dfd79c2c671732c3750af8b2476ea])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30ab3ece2033b5c69b96f0b184ad618b152dfd79c2c671732c3750af8b2476ea
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30ab3ece2033b5c69b96f0b184ad618b152dfd79c2c671732c3750af8b2476ea
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30ab3ece2033b5c69b96f0b184ad618b152dfd79c2c671732c3750af8b2476ea].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 38s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4riyv33ietvow

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #137

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/137/display/redirect>

Changes:


------------------------------------------
[...truncated 49.30 KB...]
d79efce45f86: Preparing
fb963ca4c4dc: Preparing
9e7a4c8d4431: Preparing
964c39a7cd1c: Preparing
83596ea38e00: Preparing
3cdf2cfc0e4e: Preparing
13743e8f9b7b: Preparing
20d67df6a282: Preparing
3c6b195f73b0: Preparing
2e913ba9e385: Preparing
7cd745a02111: Preparing
314c40b596c3: Preparing
78700b6b35d0: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
62a747bf1719: Preparing
2e913ba9e385: Waiting
7cd745a02111: Waiting
314c40b596c3: Waiting
78700b6b35d0: Waiting
3cdf2cfc0e4e: Waiting
9f9f651e9303: Waiting
13743e8f9b7b: Waiting
0b3c02b5d746: Waiting
ba6e5ff31f23: Waiting
62a5b8741e83: Waiting
62a747bf1719: Waiting
36e0782f1159: Waiting
20d67df6a282: Waiting
3c6b195f73b0: Waiting
83596ea38e00: Pushed
9e7a4c8d4431: Pushed
fb963ca4c4dc: Pushed
3cdf2cfc0e4e: Pushed
964c39a7cd1c: Pushed
d79efce45f86: Pushed
20d67df6a282: Pushed
3c6b195f73b0: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
7cd745a02111: Pushed
13743e8f9b7b: Pushed
36e0782f1159: Layer already exists
314c40b596c3: Pushed
9f9f651e9303: Layer already exists
ba6e5ff31f23: Layer already exists
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
2e913ba9e385: Pushed
20211101124332: digest: sha256:1ff0762820c320f2491e17b8f7fb411d8ba5bc60567b11dacdeefe67d5d22793 size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 01, 2021 12:45:18 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 01, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Nov 01, 2021 12:45:20 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 01, 2021 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 01, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 01, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Nov 01, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 01, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 2406fab948b04e175ce73b64bd32f7adc99b0f30c9e0bf834ad4d9577a1ca55c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-JAb6uUiwThdc5ztkvTL3rcmbDzDJ4L-DStTZV3ocpVw.pb
Nov 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 01, 2021 12:45:24 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b74b35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@e4e1ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d11ceef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cb2918c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@72e295cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@c2584d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa0450e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37468787, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@51ec2856, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@714f3da4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1caa9eb6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f53481b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2fcd7d3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27e7c77f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f70a21b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ae62c7e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e869098, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37c36608, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d497a91, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@617389a]
Nov 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 01, 2021 12:45:24 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a87026, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ef60710, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@600f5704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2503ec73, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@606f81b5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e1fc42f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44b21f9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ee8130e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6296474f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4288d98e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@169268a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@285c6918, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78a0ff63, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c601d50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79b2852b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@326d27ac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d499d65, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@313f8301, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5cc9d3d0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c2dfa2]
Nov 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Nov 01, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-01_05_45_24-11615703837044486865?project=apache-beam-testing
Nov 01, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-01_05_45_24-11615703837044486865
Nov 01, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-01_05_45_24-11615703837044486865
Nov 01, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-01T12:45:31.049Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-6llt. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 01, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:36.138Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 01, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:36.758Z: Expanding SplittableParDo operations into optimizable parts.
Nov 01, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:36.794Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 01, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:36.870Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 01, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:36.936Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 01, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:36.968Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 01, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.034Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 01, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.132Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.166Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.198Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.231Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.264Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.297Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.331Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.363Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.394Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.429Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.464Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.485Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.518Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.540Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.570Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.606Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.637Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.659Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.693Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.726Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.752Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.780Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:37.837Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 01, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:38.179Z: Starting 5 workers in us-central1-a...
Nov 01, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:45:55.693Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 01, 2021 12:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:46:20.209Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 01, 2021 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:47:12.957Z: Workers have started successfully.
Nov 01, 2021 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T12:47:12.983Z: Workers have started successfully.
Nov 01, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T16:00:35.421Z: Cancel request is committed for workflow job: 2021-11-01_05_45_24-11615703837044486865.
Nov 01, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T16:00:35.523Z: Cleaning up.
Nov 01, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T16:00:35.603Z: Stopping worker pool...
Nov 01, 2021 4:00:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T16:00:35.648Z: Stopping worker pool...
Nov 01, 2021 4:02:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T16:02:55.578Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 01, 2021 4:02:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-01T16:02:55.616Z: Worker pool stopped.
Nov 01, 2021 4:03:04 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-11-01_05_45_24-11615703837044486865 finished with status CANCELLED.
Load test results for test (ID): 685ba138-1ce8-4559-9c77-2e5de61ed990 and timestamp: 2021-11-01T12:45:19.655000000Z:
Metric:                                    Value:
dataflow_v2_java11_runtime_sec             11561.385
dataflow_v2_java11_total_bytes_count       2.56330316E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
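
The RuntimeException above comes from the load test's terminal-state check: the job was deliberately cancelled after the test window, but any terminal state other than DONE is reported as a failure. A minimal sketch of that kind of guard (a hypothetical simplification for illustration; class and method names here are invented, not the actual JobFailure source):

```java
public class JobStateGuard {
    // Treat any terminal job state other than DONE as a load-test failure,
    // mirroring the "Invalid job state: CANCELLED." error in the log above.
    static void handleFailure(String terminalState) {
        if (!"DONE".equals(terminalState)) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        handleFailure("DONE"); // a successful run passes silently
        try {
            handleFailure("CANCELLED");
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // prints: Invalid job state: CANCELLED.
        }
    }
}
```

Under a guard like this, a CANCELLED job fails the Gradle `:sdks:java:testing:load-tests:run` task with a non-zero exit value even though the pipeline itself ran and reported metrics.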

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211101124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ff0762820c320f2491e17b8f7fb411d8ba5bc60567b11dacdeefe67d5d22793
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211101124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ff0762820c320f2491e17b8f7fb411d8ba5bc60567b11dacdeefe67d5d22793]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211101124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ff0762820c320f2491e17b8f7fb411d8ba5bc60567b11dacdeefe67d5d22793])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ff0762820c320f2491e17b8f7fb411d8ba5bc60567b11dacdeefe67d5d22793
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ff0762820c320f2491e17b8f7fb411d8ba5bc60567b11dacdeefe67d5d22793
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1ff0762820c320f2491e17b8f7fb411d8ba5bc60567b11dacdeefe67d5d22793].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 51s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zdx37fofkis34

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #136

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/136/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #15753 from [BEAM-13080] Add option in Reshuffle to


------------------------------------------
[...truncated 48.49 KB...]
0fe811529e21: Preparing
c635961765b5: Preparing
0d2fce186cf1: Preparing
afc1c5da1be2: Preparing
7287cdf1103c: Preparing
59209505e674: Preparing
d2b2c114fca5: Preparing
59b99cfa4691: Preparing
7e1f83da258e: Preparing
54742e584f37: Preparing
c61eec161bba: Preparing
71ab1f18d33f: Preparing
78700b6b35d0: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
62a747bf1719: Preparing
54742e584f37: Waiting
c61eec161bba: Waiting
59b99cfa4691: Waiting
71ab1f18d33f: Waiting
ba6e5ff31f23: Waiting
78700b6b35d0: Waiting
7e1f83da258e: Waiting
9f9f651e9303: Waiting
0b3c02b5d746: Waiting
62a747bf1719: Waiting
36e0782f1159: Waiting
62a5b8741e83: Waiting
d2b2c114fca5: Waiting
59209505e674: Waiting
0d2fce186cf1: Pushed
7287cdf1103c: Pushed
c635961765b5: Pushed
59209505e674: Pushed
0fe811529e21: Pushed
afc1c5da1be2: Pushed
59b99cfa4691: Pushed
7e1f83da258e: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
9f9f651e9303: Layer already exists
0b3c02b5d746: Layer already exists
c61eec161bba: Pushed
62a747bf1719: Layer already exists
d2b2c114fca5: Pushed
71ab1f18d33f: Pushed
54742e584f37: Pushed
20211031124332: digest: sha256:0f47d3d3b4a01ef3041f36347f38df837db674e18059b5f6b4e635d281a81b80 size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 31, 2021 12:45:19 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 31, 2021 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 31, 2021 12:45:20 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 31, 2021 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 31, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 31, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 31, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 31, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 34bce12ab4fe2cf85115ed12731879f77613ddfaaac5dae463260061de728f33> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-NLzhKrT-LPhRFe0Scxh593YT3fqqxdrkYyYAYd5yjzM.pb
Oct 31, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 31, 2021 12:45:24 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4bd51d3e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33425811, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b74b35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@e4e1ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d11ceef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cb2918c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@72e295cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@c2584d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa0450e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37468787, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@51ec2856, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@714f3da4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1caa9eb6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f53481b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2fcd7d3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27e7c77f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f70a21b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ae62c7e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e869098, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37c36608]
Oct 31, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 31, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 31, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 31, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 31, 2021 12:45:24 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53a7a60c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@28bdbe88, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a87026, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ef60710, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@600f5704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2503ec73, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@606f81b5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e1fc42f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44b21f9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ee8130e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6296474f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4288d98e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@169268a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@285c6918, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78a0ff63, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c601d50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79b2852b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@326d27ac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d499d65, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@313f8301]
Oct 31, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 31, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 31, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 31, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 31, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 31, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 31, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 31, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 31, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 31, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 31, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 31, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 31, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-31_05_45_25-909215253647654451?project=apache-beam-testing
Oct 31, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-31_05_45_25-909215253647654451
Oct 31, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-31_05_45_25-909215253647654451
Oct 31, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-31T12:45:34.897Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-h9fm. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
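Editor's note: the warning above shows Dataflow rewriting the job name to satisfy the GCE label restrictions it links (lowercase letters, digits, and hyphens, at most 63 characters); in the modified name, every invalid character appears to have become `0`. A minimal sketch of that kind of sanitization, under the assumption that invalid characters map to `0` (this is illustrative only, not Dataflow's actual rewrite logic):

```python
import re

# Illustrative label sanitizer: lowercase the name, replace any character
# outside [a-z0-9-] with "0", and truncate to the 63-character label limit.
# The exact rules Dataflow applies are not documented in this log.
def to_cloud_label(name: str) -> str:
    label = re.sub(r"[^a-z0-9-]", "0", name.lower())
    return label[:63]
```

This reproduces the underscore-to-`0` substitution visible in the modified job name in the warning.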
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:38.961Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:39.561Z: Expanding SplittableParDo operations into optimizable parts.
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:39.607Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:39.676Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:39.725Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:39.745Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:39.800Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:39.910Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:39.937Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:39.966Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:39.993Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.016Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.045Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.066Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.097Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.131Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.161Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.200Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.235Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.267Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.291Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.316Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.345Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.371Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.413Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.437Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.471Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.494Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.527Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.562Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 31, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:40.884Z: Starting 5 workers in us-central1-a...
Oct 31, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:45:46.342Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 31, 2021 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:46:25.432Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 31, 2021 12:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:47:23.242Z: Workers have started successfully.
Oct 31, 2021 12:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T12:47:23.271Z: Workers have started successfully.
Oct 31, 2021 4:00:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T16:00:23.805Z: Cancel request is committed for workflow job: 2021-10-31_05_45_25-909215253647654451.
Oct 31, 2021 4:00:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T16:00:23.860Z: Cleaning up.
Oct 31, 2021 4:00:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T16:00:23.911Z: Stopping worker pool...
Oct 31, 2021 4:00:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T16:00:23.954Z: Stopping worker pool...
Oct 31, 2021 4:02:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T16:02:38.765Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 31, 2021 4:02:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-31T16:02:38.799Z: Worker pool stopped.
Oct 31, 2021 4:02:45 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-31_05_45_25-909215253647654451 finished with status CANCELLED.
Load test results for test (ID): 7e6a4f9b-ba60-4c25-a4ed-7a68ff45ce29 and timestamp: 2021-10-31T12:45:20.418000000Z:
Metric:                                 Value:
dataflow_v2_java11_runtime_sec          11536.735
dataflow_v2_java11_total_bytes_count    2.32191087E10
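Editor's note: the two metrics above allow a quick back-of-the-envelope throughput check for the roughly 3.2-hour streaming run, using only the values printed in this log:

```python
# Values copied from the load test results above.
runtime_sec = 11536.735
total_bytes = 2.32191087e10

# Sustained throughput over the run, roughly 2.01 MB/s.
throughput_bytes_per_sec = total_bytes / runtime_sec
```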
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
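Editor's note: this RuntimeException is the expected outcome of the cancellation above, not a separate crash: the job reached the CANCELLED terminal state rather than completing, and `JobFailure.handleFailure` treats any non-successful terminal state as a test failure. A hedged Python sketch of that pattern (the names here are illustrative; the real check lives in the Java class shown in the stack trace):

```python
# Illustrative terminal-state check, modeled on the error message above.
# Only a successfully completed job is accepted; anything else fails the test.
SUCCESSFUL_STATES = {"DONE"}

def handle_failure(terminal_state: str) -> None:
    if terminal_state not in SUCCESSFUL_STATES:
        raise RuntimeError(f"Invalid job state: {terminal_state}.")
```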

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211031124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0f47d3d3b4a01ef3041f36347f38df837db674e18059b5f6b4e635d281a81b80
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211031124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0f47d3d3b4a01ef3041f36347f38df837db674e18059b5f6b4e635d281a81b80]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211031124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0f47d3d3b4a01ef3041f36347f38df837db674e18059b5f6b4e635d281a81b80])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0f47d3d3b4a01ef3041f36347f38df837db674e18059b5f6b4e635d281a81b80
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0f47d3d3b4a01ef3041f36347f38df837db674e18059b5f6b4e635d281a81b80
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0f47d3d3b4a01ef3041f36347f38df837db674e18059b5f6b4e635d281a81b80].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 28s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/d2svplfodkiw6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #135

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/135/display/redirect?page=changes>

Changes:

[Luke Cwik] [BEAM-13015] Migrate bundle processing in the SDK harness to using

[noreply] [BEAM-8958] Use AWS credentials provider with BasicKinesisProvider

[Brian Hulette] [BEAM-13099] Copy vendored calcite build for 1.28.0

[Brian Hulette] [BEAM-13099] Modifications for vendor/calcite-1_28_0 build

[noreply] [BEAM-13099] Use BlockBuilder.add(..) rather than

[noreply] Minor: Remove broken python compatibility checks (#15828)

[noreply] [BEAM-12047] Updates CHANGES.md to mention the URN convention (#15845)

[noreply] [BEAM-13001] DoFn metrics for Go SDK (#15657)


------------------------------------------
[...truncated 48.51 KB...]
62ffb163308a: Preparing
a3e8466288ca: Preparing
af59f67fb9eb: Preparing
aa0b48ac2a87: Preparing
72ac93015a3a: Preparing
f66cbc051eb8: Preparing
bc30c76e1ea6: Preparing
f365b800c5c7: Preparing
81ceaa9f8013: Preparing
46d790545cf1: Preparing
cafa2d3a4a5a: Preparing
d0a858d06060: Preparing
78700b6b35d0: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
62a747bf1719: Preparing
f66cbc051eb8: Waiting
ba6e5ff31f23: Waiting
d0a858d06060: Waiting
9f9f651e9303: Waiting
bc30c76e1ea6: Waiting
62a747bf1719: Waiting
0b3c02b5d746: Waiting
f365b800c5c7: Waiting
36e0782f1159: Waiting
cafa2d3a4a5a: Waiting
81ceaa9f8013: Waiting
46d790545cf1: Waiting
78700b6b35d0: Waiting
62a5b8741e83: Waiting
72ac93015a3a: Pushed
af59f67fb9eb: Pushed
a3e8466288ca: Pushed
f66cbc051eb8: Pushed
aa0b48ac2a87: Pushed
f365b800c5c7: Pushed
62ffb163308a: Pushed
81ceaa9f8013: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
cafa2d3a4a5a: Pushed
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
0b3c02b5d746: Layer already exists
9f9f651e9303: Layer already exists
62a747bf1719: Layer already exists
bc30c76e1ea6: Pushed
d0a858d06060: Pushed
46d790545cf1: Pushed
20211030124336: digest: sha256:ca577b4f870e432816af8a2bc181a81bdc1fed3857c3e38c57a5bc5cc66dc5ef size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 30, 2021 12:45:31 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 30, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 30, 2021 12:45:32 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 30, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 30, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 30, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 30, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 30, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 0a61e45cf248142a8466894ba36c31242228cfac6e4c35bc6eb64e4aa62646a3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-CmHkXPJIFCqEZolLo2wxJCIoz6xuTDW8brZOSqYmRqM.pb
Oct 30, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 30, 2021 12:45:36 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33425811, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b74b35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@e4e1ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d11ceef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cb2918c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@72e295cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@c2584d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa0450e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37468787, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@51ec2856, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@714f3da4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1caa9eb6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f53481b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2fcd7d3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27e7c77f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f70a21b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ae62c7e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e869098, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37c36608, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d497a91]
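Editor's note: the "sizes" printed above are not sizes at all. Each entry is Java's default `Object.toString()` output (fully qualified class name, `@`, hex identity hash code), because `SyntheticUnboundedSource` does not override `toString()`. Python exhibits the analogous behavior when a class defines no `__repr__`:

```python
# A class with no __repr__/__str__ falls back to the default object repr,
# which shows only the class name and the object's memory address -
# analogous to Java's ClassName@hexHashCode in the log line above.
class SyntheticSource:
    pass

default_repr = repr(SyntheticSource())
```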
Oct 30, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 30, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 30, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 30, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 30, 2021 12:45:36 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@28bdbe88, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a87026, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ef60710, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@600f5704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2503ec73, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@606f81b5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e1fc42f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44b21f9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ee8130e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6296474f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4288d98e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@169268a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@285c6918, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78a0ff63, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c601d50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79b2852b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@326d27ac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d499d65, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@313f8301, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5cc9d3d0]
Oct 30, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 30, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 30, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 30, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 30, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 30, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 30, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 30, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 30, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 30, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 30, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 30, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 30, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-30_05_45_36-1707718346160858959?project=apache-beam-testing
Oct 30, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-30_05_45_36-1707718346160858959
Oct 30, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-30_05_45_36-1707718346160858959
Oct 30, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-30T12:45:43.393Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-too8. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 30, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:47.528Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 30, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.328Z: Expanding SplittableParDo operations into optimizable parts.
Oct 30, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.359Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.462Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.534Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.562Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.611Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.673Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.689Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.716Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.735Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.758Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.781Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.803Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.827Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.848Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.872Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.885Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.921Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.944Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:48.967Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:49.000Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:49.035Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:49.064Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:49.090Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:49.122Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:49.151Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:49.174Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:49.240Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:49.266Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 30, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:45:49.556Z: Starting 5 workers in us-central1-a...
Oct 30, 2021 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:46:20.123Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 30, 2021 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:46:31.368Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 30, 2021 12:47:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:47:27.023Z: Workers have started successfully.
Oct 30, 2021 12:47:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T12:47:27.068Z: Workers have started successfully.
Oct 30, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T16:00:24.234Z: Cancel request is committed for workflow job: 2021-10-30_05_45_36-1707718346160858959.
Oct 30, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T16:00:24.303Z: Cleaning up.
Oct 30, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T16:00:24.365Z: Stopping worker pool...
Oct 30, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T16:00:24.410Z: Stopping worker pool...
Oct 30, 2021 4:02:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T16:02:46.104Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 30, 2021 4:02:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-30T16:02:46.142Z: Worker pool stopped.
Oct 30, 2021 4:02:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-30_05_45_36-1707718346160858959 finished with status CANCELLED.
Load test results for test (ID): 429ea241-71e6-40cd-ab5f-05ebe6f1df4f and timestamp: 2021-10-30T12:45:32.021000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11536.936
dataflow_v2_java11_total_bytes_count             1.95846834E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211030124336
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ca577b4f870e432816af8a2bc181a81bdc1fed3857c3e38c57a5bc5cc66dc5ef
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211030124336]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ca577b4f870e432816af8a2bc181a81bdc1fed3857c3e38c57a5bc5cc66dc5ef]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211030124336] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ca577b4f870e432816af8a2bc181a81bdc1fed3857c3e38c57a5bc5cc66dc5ef])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ca577b4f870e432816af8a2bc181a81bdc1fed3857c3e38c57a5bc5cc66dc5ef
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ca577b4f870e432816af8a2bc181a81bdc1fed3857c3e38c57a5bc5cc66dc5ef
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ca577b4f870e432816af8a2bc181a81bdc1fed3857c3e38c57a5bc5cc66dc5ef].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 33s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/xdi223qrw356q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #134

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/134/display/redirect?page=changes>

Changes:

[artur.khanin] Implemented GetCompileOutput and covered with unit tests

[artur.khanin] Replaced logging library regarding PR comment

[daria.malkova] fix executor tests

[mmack] [BEAM-13138] Update code to retrieve container endpoint

[aydar.zaynutdinov] [BEAM-13120][Playground]

[noreply] Merge pull request #15709 from [BEAM-12967] [Playground] Create Example

[noreply] Merge pull request #15294 from [BEAM-11986] Spanner write metric

[noreply] [BEAM-13130] Remove persistent references to stateKeyReaders (#15815)

[Brian Hulette] [BEAM-13099] Replace call to RelNode.metadata with BeamRelMetadataQuery

[Brian Hulette] [BEAM-13099] Update all rels to use BeamRelMetadataQuery

[Kyle Weaver] [BEAM-13143] Fix python doc generator error.

[Pablo Estrada] Handle runner-provided shards for TextIO

[noreply] Merge pull request #15721 from [BEAM-13023][Playground] Implement Redis

[noreply] Merge pull request #15802 from [BEAM-12970][Playground] Implement gRPC

[noreply] Merge pull request #15800 from [BEAM-12970][Playground] Implement gRPC


------------------------------------------
[...truncated 49.03 KB...]
0b3c02b5d746: Preparing
62a747bf1719: Preparing
c30b6163b38e: Waiting
8ce6e5c48199: Waiting
5ecec3e50f1b: Waiting
36e0782f1159: Waiting
b30ebf97fccb: Waiting
37f8420a0833: Waiting
78700b6b35d0: Waiting
62a747bf1719: Waiting
ba6e5ff31f23: Waiting
9f9f651e9303: Waiting
ec6ed34ef00f: Waiting
dcbb5dc233cc: Waiting
62a5b8741e83: Waiting
582395212cc7: Pushed
26d172e729b8: Pushed
c89c93be1ca3: Pushed
258bf8261d25: Pushed
c30b6163b38e: Pushed
3ae428a4dd9b: Pushed
8ce6e5c48199: Pushed
37f8420a0833: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
ec6ed34ef00f: Pushed
5ecec3e50f1b: Pushed
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
9f9f651e9303: Layer already exists
dcbb5dc233cc: Pushed
b30ebf97fccb: Pushed
20211029124334: digest: sha256:4f56c4f2248507a6ef032216b3cba8d4f325946a16fa7090a368d818948b0dd2 size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 29, 2021 12:45:22 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 29, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 29, 2021 12:45:23 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 29, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 29, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 29, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 29, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 29, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 206278b525d10e9af199c1546290e40d06e1bfcd6a4d1e491b2df840adbdc45d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-IGJ4tSXRDprxmcFUYpDkDQbhv81qTR5JGy34QK29xF0.pb
Oct 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 29, 2021 12:45:27 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b74b35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@e4e1ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d11ceef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cb2918c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@72e295cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@c2584d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa0450e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37468787, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@51ec2856, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@714f3da4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1caa9eb6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f53481b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2fcd7d3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27e7c77f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f70a21b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ae62c7e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e869098, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37c36608, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d497a91, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@617389a]
Oct 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 29, 2021 12:45:27 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a87026, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ef60710, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@600f5704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2503ec73, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@606f81b5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e1fc42f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44b21f9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ee8130e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6296474f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4288d98e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@169268a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@285c6918, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78a0ff63, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c601d50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79b2852b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@326d27ac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d499d65, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@313f8301, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5cc9d3d0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c2dfa2]
Oct 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 29, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 29, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 29, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-29_05_45_28-16658831343352750248?project=apache-beam-testing
Oct 29, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-29_05_45_28-16658831343352750248
Oct 29, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-29_05_45_28-16658831343352750248
Oct 29, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-29T12:45:40.721Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-5bwi. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 29, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:52.807Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 29, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:53.753Z: Expanding SplittableParDo operations into optimizable parts.
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:53.853Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:53.935Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:53.994Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.030Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.094Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.204Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.239Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.263Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.297Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.328Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.359Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.386Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.421Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.444Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.467Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.499Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.532Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.561Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.597Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.619Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.643Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.680Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.713Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.749Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.774Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.814Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.840Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:54.877Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 29, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:45:55.352Z: Starting 5 workers in us-central1-a...
Oct 29, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:46:09.258Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 29, 2021 12:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:46:40.686Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 29, 2021 12:47:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:47:35.788Z: Workers have started successfully.
Oct 29, 2021 12:47:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T12:47:35.820Z: Workers have started successfully.
Oct 29, 2021 3:51:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-29T15:51:55.911Z: Staged package amazon-kinesis-client-1.14.2-qtBh3OuKuLE7S26ITTJzYsoYfG-mWjqbGmv3VJPhec8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/amazon-kinesis-client-1.14.2-qtBh3OuKuLE7S26ITTJzYsoYfG-mWjqbGmv3VJPhec8.jar' is inaccessible.
Oct 29, 2021 3:51:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-29T15:51:58.406Z: Staged package jakarta.activation-api-1.2.2-oYepORA671hJp6-EvX4nvi0SDEEK8pFDc3X_4GH08J0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jakarta.activation-api-1.2.2-oYepORA671hJp6-EvX4nvi0SDEEK8pFDc3X_4GH08J0.jar' is inaccessible.
Oct 29, 2021 3:51:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-29T15:51:58.454Z: Staged package jakarta.xml.bind-api-2.3.3-wEU59HLppt0MdoXqgtZ3KCJpq457rKLhRQDjgeDGzsU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jakarta.xml.bind-api-2.3.3-wEU59HLppt0MdoXqgtZ3KCJpq457rKLhRQDjgeDGzsU.jar' is inaccessible.
Oct 29, 2021 3:52:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-29T15:51:59.536Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 29, 2021 3:55:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-29T15:54:59.216Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 29, 2021 3:57:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-29T15:57:55.516Z: Staged package amazon-kinesis-client-1.14.2-qtBh3OuKuLE7S26ITTJzYsoYfG-mWjqbGmv3VJPhec8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/amazon-kinesis-client-1.14.2-qtBh3OuKuLE7S26ITTJzYsoYfG-mWjqbGmv3VJPhec8.jar' is inaccessible.
Oct 29, 2021 3:57:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-29T15:57:58.185Z: Staged package jakarta.activation-api-1.2.2-oYepORA671hJp6-EvX4nvi0SDEEK8pFDc3X_4GH08J0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jakarta.activation-api-1.2.2-oYepORA671hJp6-EvX4nvi0SDEEK8pFDc3X_4GH08J0.jar' is inaccessible.
Oct 29, 2021 3:57:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-29T15:57:58.225Z: Staged package jakarta.xml.bind-api-2.3.3-wEU59HLppt0MdoXqgtZ3KCJpq457rKLhRQDjgeDGzsU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jakarta.xml.bind-api-2.3.3-wEU59HLppt0MdoXqgtZ3KCJpq457rKLhRQDjgeDGzsU.jar' is inaccessible.
Oct 29, 2021 3:58:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-29T15:57:59.313Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 29, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T16:00:28.573Z: Cancel request is committed for workflow job: 2021-10-29_05_45_28-16658831343352750248.
Oct 29, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T16:00:28.606Z: Cleaning up.
Oct 29, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T16:00:28.671Z: Stopping worker pool...
Oct 29, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T16:00:28.725Z: Stopping worker pool...
Oct 29, 2021 4:02:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T16:02:51.239Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 29, 2021 4:02:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-29T16:02:51.282Z: Worker pool stopped.
Oct 29, 2021 4:02:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-29_05_45_28-16658831343352750248 finished with status CANCELLED.
Load test results for test (ID): be533bbc-7af2-421b-9f7e-d37632c77237 and timestamp: 2021-10-29T12:45:23.184000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11535.407
dataflow_v2_java11_total_bytes_count             2.60500954E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211029124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4f56c4f2248507a6ef032216b3cba8d4f325946a16fa7090a368d818948b0dd2
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211029124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4f56c4f2248507a6ef032216b3cba8d4f325946a16fa7090a368d818948b0dd2]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211029124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4f56c4f2248507a6ef032216b3cba8d4f325946a16fa7090a368d818948b0dd2])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4f56c4f2248507a6ef032216b3cba8d4f325946a16fa7090a368d818948b0dd2
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4f56c4f2248507a6ef032216b3cba8d4f325946a16fa7090a368d818948b0dd2
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4f56c4f2248507a6ef032216b3cba8d4f325946a16fa7090a368d818948b0dd2].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 41s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/mbbhbhsxp2a3q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #133

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/133/display/redirect?page=changes>

Changes:

[relax] support json

[aydar.zaynutdinov] [BEAM-12970][Playground]

[Andrew Pilloud] [BEAM-13118] Don't force push to master

[aydar.zaynutdinov] [BEAM-12970][Playground]

[eugene.nikolayev] Fix Python docs build pre-commit failures.

[aydar.zaynutdinov] [BEAM-12970][Playground]

[noreply] Merge pull request #15731: [BEAM-13067]  Mark GroupIntoBatches output as

[noreply] [BEAM-13066, BEAM-13087] Workaround for coder type combiner packing

[noreply] [BEAM-13066] Disable using abstract iterable by default. (#15805)

[aydar.zaynutdinov] [BEAM-12970][Playground]


------------------------------------------
[...truncated 49.66 KB...]
e26e7b75e334: Waiting
8228897593b2: Waiting
62a747bf1719: Waiting
b1a9e1733fc1: Waiting
0b3c02b5d746: Waiting
00ab2a787668: Pushed
388e7180b083: Pushed
5ebf909675aa: Pushed
56fb06668e82: Pushed
78a36dbe60ed: Pushed
f90c75b6eb72: Pushed
a0ac71b586b2: Pushed
1ec9e1de6222: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
9f9f651e9303: Layer already exists
8228897593b2: Pushed
0b3c02b5d746: Layer already exists
b1a9e1733fc1: Pushed
62a747bf1719: Layer already exists
91e2eeb5b486: Pushed
e26e7b75e334: Pushed
20211028124334: digest: sha256:45b73fe66dba08219be66a8bfb80995617d5dd6f86954ea5cc1b0a716ecf80a4 size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 28, 2021 12:45:46 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 28, 2021 12:45:46 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 28, 2021 12:45:47 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 28, 2021 12:45:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 28, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 28, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 28, 2021 12:45:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 28, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash e8c831ec15148909a4024f127ebe2cb158a84663463ea98c935c2b2399f4861d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-6Mgx7BUUiQmkAk8Sfr4ssVioRmNGPqmMk1wrI5n0hh0.pb
Oct 28, 2021 12:45:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 28, 2021 12:45:52 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4bd51d3e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33425811, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b74b35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@e4e1ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d11ceef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cb2918c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@72e295cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@c2584d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa0450e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37468787, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@51ec2856, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@714f3da4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1caa9eb6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f53481b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2fcd7d3f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27e7c77f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f70a21b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ae62c7e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e869098, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37c36608]
Oct 28, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 28, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 28, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 28, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 28, 2021 12:45:52 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53a7a60c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@28bdbe88, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a87026, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ef60710, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@600f5704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2503ec73, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@606f81b5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e1fc42f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44b21f9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ee8130e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6296474f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4288d98e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@169268a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@285c6918, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78a0ff63, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c601d50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79b2852b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@326d27ac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d499d65, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@313f8301]
Oct 28, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 28, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 28, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 28, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 28, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 28, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 28, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 28, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 28, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 28, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 28, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 28, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-28_05_45_52-1818316849038594164?project=apache-beam-testing
Oct 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-28_05_45_52-1818316849038594164
Oct 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-28_05_45_52-1818316849038594164
Oct 28, 2021 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-28T12:45:59.114Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-69kp. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:02.703Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:03.428Z: Expanding SplittableParDo operations into optimizable parts.
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:03.463Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:03.529Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:03.597Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:03.636Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:03.715Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:03.835Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:03.882Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:03.913Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:03.947Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:03.973Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.005Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.039Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.074Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.114Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.151Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.196Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.222Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.282Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.322Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.344Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.377Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.413Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.446Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.475Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.503Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.561Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.588Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.623Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 28, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:04.948Z: Starting 5 workers in us-central1-a...
Oct 28, 2021 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:28.643Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 28, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:46:55.045Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 28, 2021 12:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:47:50.846Z: Workers have started successfully.
Oct 28, 2021 12:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T12:47:50.875Z: Workers have started successfully.
Oct 28, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T16:00:31.765Z: Cancel request is committed for workflow job: 2021-10-28_05_45_52-1818316849038594164.
Oct 28, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T16:00:31.826Z: Cleaning up.
Oct 28, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T16:00:31.906Z: Stopping worker pool...
Oct 28, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T16:00:31.969Z: Stopping worker pool...
Oct 28, 2021 4:02:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T16:02:54.314Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 28, 2021 4:02:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-28T16:02:54.353Z: Worker pool stopped.
Oct 28, 2021 4:03:01 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-28_05_45_52-1818316849038594164 finished with status CANCELLED.
Load test results for test (ID): 86586faa-85c5-475e-bb26-f279091994f4 and timestamp: 2021-10-28T12:45:47.153000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11535.023
dataflow_v2_java11_total_bytes_count             2.01960346E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211028124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45b73fe66dba08219be66a8bfb80995617d5dd6f86954ea5cc1b0a716ecf80a4
Deleted: sha256:7b8a540f712ea0eb598ca8ddd270c5710ac891842758fa2d3455c0db502f40ab
Deleted: sha256:f372086c4f5ae7c7c0ce763fac9dc6bc4161736301f5aaec66878b99b0502529
Deleted: sha256:458ecf7a865d8eba4376c70146397e0d5b02237745ccca939932e41d96735d16
Deleted: sha256:46ea50f1866802b31d07b4ab19c4fe70eaa990075daed270f88a27ee6e9d15f3
Deleted: sha256:b07d659ec5f4ed01f6548df29c5749e72d8b2177b51846ad8b9c02b01fa86a37
Deleted: sha256:dfe0f31e6e8c8a66fcc6235dae26630e120ef103bc016d77cf6247e53ce832b3
Deleted: sha256:ff388a048e8268cb88e9cdabaf56126c685e6156af10d2fa02126cf43bd6de5f
Deleted: sha256:309a59039c3fc7e4f0961bccda512d9fd6243bdd0d93e24bb72da28e6a74d651
Deleted: sha256:1960228fd9a804111d33d6fa1a462fed3d1385ece321a13b4c9bfe7d2eacf109
Deleted: sha256:239020881b140115586d9538f1ebc55effe8ec829942ab2b0e4ccbcfbfe32e30
Deleted: sha256:0543010a40f2c412a69a538eb3f472afd1e6799de50774caa90907e1252d9c30
Deleted: sha256:48a9f8e53cb05195f2a0ba5e69090ed4dfce8fec86bb525fc3384a5bddb7de2e
Deleted: sha256:dda7348a148f4058c9ba0d141421c7458590e30619c36e3c635ab2c04bb967b6
Deleted: sha256:fc422520ced6e158c3e0d16e44679f829892d5ef7cbdd5180b18eb6ecb3e4196
Deleted: sha256:7915ed67eb5ed08f5945bb3be29e2929e6fc3a3901d601ff1ef5a7df94607b68
Deleted: sha256:04f804135e73a663e67bc2ea1ce7cdb6f4ba0433364db5bebe9b670c74d928d5
Deleted: sha256:64c78e03070c97889adea60a295fe3249cb2d0d8f8de214579679b96ed81ac3d
Deleted: sha256:20dacd2ebe0b995e99a99aeb1379845ab556f4dc4a46a7999225953e1137c125
Deleted: sha256:bc97d320597c908480ef5b16e75151a9709dd99105b7f4e7208477783354e194
Deleted: sha256:e501e220c159658f272666a412f7fa81fcecc117c2d119854abcd0f27a4b0a56
Deleted: sha256:978b6f728cb2549bc8febbdf82552535701ff9a030c62e836d3c9255b7424cc7
Deleted: sha256:661bb97f38323578ff4497b96cf563f3a3202bb5dd36399daf4c6bfa80e99e17
Deleted: sha256:0d620287c71344ac5ee1423d8e349e7e6b955774457b1fa06cbab7e3873fba96
Deleted: sha256:477cc05977e4d760baa2b03387050ece45b170c7dbde14883f16224bd8c334c1
Deleted: sha256:bfc349587255729d9565b1997d69e9b82a13c748e2913b4fb89a721b6f1fc2e9
Deleted: sha256:7600d5c7e778546d22c284952da4e6a9449e18df9190a03c598b98c10de4740d
Deleted: sha256:17814636cff4f3683b8f05b3c293d63819f462bb24b572915c82e2602c0e73d7
Deleted: sha256:bf514f3871b7fb954332f045376680a44cb3e86f572d65e8989060e699df4612
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211028124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45b73fe66dba08219be66a8bfb80995617d5dd6f86954ea5cc1b0a716ecf80a4]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211028124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45b73fe66dba08219be66a8bfb80995617d5dd6f86954ea5cc1b0a716ecf80a4])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45b73fe66dba08219be66a8bfb80995617d5dd6f86954ea5cc1b0a716ecf80a4
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45b73fe66dba08219be66a8bfb80995617d5dd6f86954ea5cc1b0a716ecf80a4
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45b73fe66dba08219be66a8bfb80995617d5dd6f86954ea5cc1b0a716ecf80a4].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 44s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/eujrlkngy34li

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #132

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/132/display/redirect?page=changes>

Changes:

[baeminbo] [BEAM-6721] Set numShards dynamically for TextIO.write()

[heejong] [BEAM-12978] Customizable dependency for Java external transform

[melissapa] [BEAM-11758] Update basics page: Splittable DoFn

[heejong] add ConfigT param to getDependencies method

[Alexey Romanenko] [BEAM-13104] ParquetIO: SplitReadFn must read the whole block

[noreply] Change sql.Options to an interface under sqlx. (#15790)

[melissapa] Address review feedback

[noreply] [BEAM-4149] Ensure that we always provide and require the worker id.

[noreply] [BEAM-13098] Fix translation of repeated TableRow fields (#15779)

[mmack] [BEAM-8958] Use AWS credentials provider with BasicKinesisProvider (AWS


------------------------------------------
[...truncated 50.04 KB...]
0b3c02b5d746: Waiting
a1157078efa7: Pushed
f1f6158ece23: Pushed
ca9cdaf33614: Pushed
d06b9c13e0cc: Pushed
f3f8ce23f305: Pushed
b1513d622f1f: Pushed
fb827792f82e: Pushed
1a8f734413d4: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
c6149d467a89: Pushed
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
9f9f651e9303: Layer already exists
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
7c70d01ceb3f: Pushed
e95d1c8a7989: Pushed
0e9f430337a7: Pushed
20211027124331: digest: sha256:a9836a031fe12e537fab384f6e8347c580e4f4bee44384a5e4f40294e504d354 size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 27, 2021 12:45:20 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 27, 2021 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 27, 2021 12:45:21 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 27, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 27, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 27, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 1 seconds
Oct 27, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 27, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash cf636970a24e703fddec73b4e5776da0dfdc9be12795ffd4683708b270884f75> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-z2NpcKJOcD_d7HO05XdtoN_cm-Enlf_UaDcIsnCIT3U.pb
Oct 27, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 27, 2021 12:45:26 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d805aa1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@176f7f3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@30ca0779, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58740366, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47be0f9b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2bc426f0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4bd51d3e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33425811, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b74b35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@e4e1ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d11ceef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cb2918c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@72e295cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@c2584d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa0450e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37468787, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@51ec2856, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@714f3da4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1caa9eb6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f53481b]
Oct 27, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 27, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 27, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 27, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 27, 2021 12:45:26 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34b9eb03, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43fda8d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49d831c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53a7a60c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@28bdbe88, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a87026, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ef60710, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@600f5704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2503ec73, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@606f81b5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e1fc42f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44b21f9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ee8130e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6296474f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4288d98e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@169268a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@285c6918]
Oct 27, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 27, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 27, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 27, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 27, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 27, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 27, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 27, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 27, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 27, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 27, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 27, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 27, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-27_05_45_26-11863955454072432916?project=apache-beam-testing
Oct 27, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-27_05_45_26-11863955454072432916
Oct 27, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-27_05_45_26-11863955454072432916
Oct 27, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-27T12:45:34.064Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-zy8s. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
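For reference, Cloud Labels must be at most 63 characters and contain only lowercase letters, digits, and hyphens, starting with a letter; the modified job name in the warning above suggests that invalid characters are replaced with `0`. A minimal sketch of such a sanitizer (illustrative only, not Dataflow's actual code):

```python
import re

def to_cloud_label(name: str, max_len: int = 63) -> str:
    """Approximate a Cloud-Label-safe job name (illustrative sketch only).

    Lowercases the input and replaces characters outside [a-z0-9-] with
    '0', mirroring the substitution visible in the warning above, then
    truncates to the label length limit.
    """
    label = re.sub(r"[^a-z0-9-]", "0", name.lower())
    # Labels must start with a lowercase letter.
    if not label or not label[0].isalpha():
        label = "l" + label
    return label[:max_len]
```

Naming the Jenkins job with only lowercase letters, digits, and hyphens up front would avoid the substitution entirely, as the warning recommends.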
Oct 27, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:39.151Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:39.972Z: Expanding SplittableParDo operations into optimizable parts.
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.001Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.070Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.153Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.187Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.266Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.371Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.397Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.444Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.481Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.509Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.567Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.599Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.643Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.671Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.714Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.753Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.786Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 27, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.820Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 27, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.851Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 27, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.885Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 27, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.912Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 27, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.944Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 27, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:40.979Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 27, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:41.013Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 27, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:41.045Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 27, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:41.073Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 27, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:41.105Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 27, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:41.134Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 27, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:41.551Z: Starting 5 workers in us-central1-a...
Oct 27, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:45:54.265Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 27, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:46:26.561Z: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 27, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:46:26.649Z: Resized worker pool to 3, though goal was 5. This could be a quota issue.
Oct 27, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:46:36.969Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 27, 2021 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:47:28.303Z: Workers have started successfully.
Oct 27, 2021 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T12:47:28.536Z: Workers have started successfully.
Oct 27, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T16:00:24.885Z: Cancel request is committed for workflow job: 2021-10-27_05_45_26-11863955454072432916.
Oct 27, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T16:00:28.844Z: Cleaning up.
Oct 27, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T16:00:29.026Z: Stopping worker pool...
Oct 27, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T16:00:29.115Z: Stopping worker pool...
Oct 27, 2021 4:02:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T16:02:51.327Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 27, 2021 4:02:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-27T16:02:51.363Z: Worker pool stopped.
Oct 27, 2021 4:02:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-27_05_45_26-11863955454072432916 finished with status CANCELLED.
Load test results for test (ID): 22aa9eb4-7c1b-40ef-a3fd-377cfa270688 and timestamp: 2021-10-27T12:45:21.120000000Z:
Metric:                                  Value:
dataflow_v2_java11_runtime_sec           11468.212
dataflow_v2_java11_total_bytes_count     2.41295807E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
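The exception above comes from the load-test harness treating any terminal state other than a successful completion as a failed run, so a job cancelled by the timeout fails the build even though metrics were collected. A minimal sketch of that check (hypothetical names; the actual logic lives in org.apache.beam.sdk.loadtests.JobFailure):

```python
# Hypothetical sketch of the terminal-state check behind the
# "Invalid job state: CANCELLED" error; not Beam's actual implementation.
SUCCESSFUL_STATES = {"DONE"}

def handle_failure(terminal_state: str) -> None:
    """Raise if the pipeline did not finish in a successful state."""
    if terminal_state not in SUCCESSFUL_STATES:
        raise RuntimeError(f"Invalid job state: {terminal_state}.")
```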

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211027124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a9836a031fe12e537fab384f6e8347c580e4f4bee44384a5e4f40294e504d354
Deleted: sha256:29a2fb30aabd6306152aea1c69a13e6161b03e9308e3a2771c37aacdb9f95556
Deleted: sha256:4fcdc721c79e58a7657fe1f07320c5b17ca1e681701504dcb8c1118b7b204126
Deleted: sha256:3eb87ca63c0408c9821d87f75407fca2356d9d57040fdda86ba0a62c0c96841a
Deleted: sha256:6011c3fe28996fcfcfa7f524cf616d6d08b8e4bed230eff3b48a7a49a68030f3
Deleted: sha256:9c9183b684eb3424ba90fab5516b2aa37db460768d23a17cf88730a9a63178de
Deleted: sha256:1c38578c3431f8c9ee3b49b5ee2d0e9599f27a64fa6902cfd3dd2edaac4fc904
Deleted: sha256:5fde1c63f53c457a9ed24bd050bacd89a43251164ce7cfafe7accf29a31c592a
Deleted: sha256:3dbfb0a3c5ad0267c36289166b5bfe23a5bfd346932faf855af7e183074de86e
Deleted: sha256:9f059aa9a9adc643802c7a6b3dce15c9b87178be55058bac0407838803e0734a
Deleted: sha256:f708312fbf668adfda1e4e92168f2be02f9d4b5f7799517531343acdbfba3658
Deleted: sha256:0b25710b042a20e99ce0605efa0454565b2a72f9f5e46bb91e2bab7d7f1c7367
Deleted: sha256:484cdea898879b77713ed2895d89e34ba45e36138815eddd146f3beac6e3d74d
Deleted: sha256:e3e97a9fc570a6cd89cf1e29bb6684bee9981ccad0f92d4f645288d47aabd138
Deleted: sha256:84886734949dee63c66d8a28b79744627f3f281c8e6a4d17384061e32d608d66
Deleted: sha256:3caaab00c05f6dfc030aa0595e4cea9b535b772c1858b9df96a5dcdaffaf8a7b
Deleted: sha256:a3250ae93ff833ac5142c19be09a14b2c77ab6b6148033b238eba8552cfeb98b
Deleted: sha256:f694412198b1a3a88e22608bf6acfb9c72c298bdcf2e8daa1a633917f3b1b5fa
Deleted: sha256:ff21c0dab017c561d384d8a7b6cc011aab5e0d70abdde54c21ada35d9e813f57
Deleted: sha256:5cf651323ddd4b2ae4f0e1d7c1346abd51960e1dbcea714ac1b166bcb96f9fef
Deleted: sha256:a57adda0fc8c30c95eb4b3fa522d0a10d1fdb06421dcbc999062987b4531e240
Deleted: sha256:cfc7ace7196831e4cba0cffbd93bf06aa5886b07d1cc7d1231302d503a449fc5
Deleted: sha256:60cbbe023d727747a080fc87dc3542993ae473c442fd1b61a1cd512bfdd39853
Deleted: sha256:1395c040125fca7d96ceba8cc0a27a219b49534df00f5797c3d1a94e419c8963
Deleted: sha256:8dc30115a541273a64c24df0cb6fe5f83955955eab5de2da7588b376d95dc720
Deleted: sha256:ce966b3fe9d0f2837d297660d692ce3dc72c2e4589ca22f1f3c70588a4e7665f
Deleted: sha256:7bc284fc567a3e35be0f9a04d39b93adf8486074888157675da5dc8a93d5ebcc
Deleted: sha256:b677c0d8b1da00e9d43d979e6496c6fc2203fb70e34e12fd8d3193abf2d73eab
Deleted: sha256:bc7d5a1df07c51e3a4d8b95220ae0b2098a9cee9e2c6bdb725b45ba6023d134d
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211027124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a9836a031fe12e537fab384f6e8347c580e4f4bee44384a5e4f40294e504d354]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211027124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a9836a031fe12e537fab384f6e8347c580e4f4bee44384a5e4f40294e504d354])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a9836a031fe12e537fab384f6e8347c580e4f4bee44384a5e4f40294e504d354
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a9836a031fe12e537fab384f6e8347c580e4f4bee44384a5e4f40294e504d354
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a9836a031fe12e537fab384f6e8347c580e4f4bee44384a5e4f40294e504d354].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 43s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/okbgq2jqeo3fw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #131

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/131/display/redirect?page=changes>

Changes:

[Luke Cwik] [BEAM-12522] Enable side inputs on all splittable DoFn execution time

[aydar.zaynutdinov] [BEAM-13062][Playground]

[ilya.kozyrev] Add environment_service.go and structures for beam sdk, network envs,

[aydar.zaynutdinov] [BEAM-13062][Playground]

[noreply] Merge pull request #15772 from [BEAM-13032] [Playground] Implement the

[noreply] Merge pull request #15714 from [BEAM-13005] [Playground] Implement local

[noreply] Merge pull request #15770 from [BEAM-13095][Playground] Using working

[noreply] [BEAM-11758] Update basics page: Aggregation, Runner, UDF, Schema

[noreply] Merge pull request #15783 from [BEAM-13048] [Playground] Add shortcuts

[Luke Cwik] [BEAM-8543] Fix test filtering for Dataflow Runner V2 to exclude

[noreply] Merge pull request #15744 from [BEAM-13072][Playground] Executor builder


------------------------------------------
[...truncated 49.23 KB...]
44671907d352: Pushed
22c6cd0d30a3: Pushed
8f8c6ec877ad: Pushed
90a44032420d: Pushed
7eea6e1f20fe: Pushed
9996e7cf888a: Pushed
9ee9aabadd72: Pushed
2116646bc5a6: Pushed
c9bb102d5ff2: Pushed
78700b6b35d0: Layer already exists
b6f1f004368f: Pushed
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
0b3c02b5d746: Layer already exists
9f9f651e9303: Layer already exists
62a747bf1719: Layer already exists
358de67a5df2: Pushed
20211026124331: digest: sha256:c38f0177a7ef3f57599c16a2ef5297bf189e09217ebd02d1afefe1ff41eb7208 size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 26, 2021 12:45:17 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 26, 2021 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 26, 2021 12:45:18 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 26, 2021 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 26, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 26, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 26, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 26, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 64e91be1f694bcc161b27a0aac7b95ea2b0777def76fda00e27b537035695b40> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ZOkb4faUvMFhsnoKrHuV6isHd973b9oA4ntTcDVpW0A.pb
Oct 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 26, 2021 12:45:22 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@176f7f3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@30ca0779, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58740366, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47be0f9b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2bc426f0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4bd51d3e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33425811, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b74b35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@e4e1ef5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d11ceef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cb2918c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@72e295cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@c2584d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fa0450e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37468787, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@51ec2856, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@714f3da4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1caa9eb6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f53481b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2fcd7d3f]
Oct 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 26, 2021 12:45:22 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34b9eb03, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43fda8d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49d831c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53a7a60c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@28bdbe88, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a87026, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ef60710, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@600f5704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2503ec73, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@606f81b5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e1fc42f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44b21f9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ee8130e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6296474f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4288d98e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@169268a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@285c6918, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78a0ff63]
Oct 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 26, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-26_05_45_23-5340577711597487272?project=apache-beam-testing
Oct 26, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-26_05_45_23-5340577711597487272
Oct 26, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-26_05_45_23-5340577711597487272
Oct 26, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-26T12:45:34.029Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-2trs. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 26, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:38.516Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:39.311Z: Expanding SplittableParDo operations into optimizable parts.
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:39.348Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:39.418Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:39.489Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:39.533Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:39.598Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:39.709Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:39.746Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:39.767Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:39.800Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:39.823Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:39.848Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:39.878Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:39.922Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:39.960Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:39.983Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:40.010Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:40.041Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:40.074Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:40.096Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:40.119Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:40.144Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:40.165Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:40.201Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:40.239Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:40.266Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:40.289Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:40.316Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:40.346Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 26, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:45:40.675Z: Starting 5 workers in us-central1-a...
Oct 26, 2021 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:46:13.855Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 26, 2021 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:46:26.776Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 26, 2021 12:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:47:22.202Z: Workers have started successfully.
Oct 26, 2021 12:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T12:47:22.227Z: Workers have started successfully.
Oct 26, 2021 4:00:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T15:59:59.746Z: Workers have started successfully.
Oct 26, 2021 4:00:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T16:00:05.068Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 26, 2021 4:00:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T16:00:14.720Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 26, 2021 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T16:00:44.681Z: Cancel request is committed for workflow job: 2021-10-26_05_45_23-5340577711597487272.
Oct 26, 2021 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T16:00:44.700Z: Cleaning up.
Oct 26, 2021 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T16:00:44.760Z: Stopping worker pool...
Oct 26, 2021 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T16:00:44.812Z: Stopping worker pool...
Oct 26, 2021 4:03:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T16:03:07.035Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 26, 2021 4:03:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-26T16:03:07.092Z: Worker pool stopped.
Oct 26, 2021 4:03:11 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-26_05_45_23-5340577711597487272 finished with status CANCELLED.
Load test results for test (ID): 144ef194-9ac1-45b7-ae35-bf04bf36890b and timestamp: 2021-10-26T12:45:18.359000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11561.244
dataflow_v2_java11_total_bytes_count             3.54599776E10
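[Editor's note: the runtime metric above is reported in seconds and the byte count in scientific notation; as a quick sanity check they can be converted to hours and gigabytes and compared against the job's wall-clock window (submitted 12:45 UTC, cancelled 16:00 UTC, roughly 3.25 h). A minimal sketch of that conversion, with the two values copied verbatim from the metrics table — nothing here comes from Beam itself:]

```java
// Sanity-check the reported load test metrics by converting units.
// Both constants are copied verbatim from the metrics table above.
public class MetricsSanityCheck {
    public static void main(String[] args) {
        double runtimeSec = 11561.244;        // dataflow_v2_java11_runtime_sec
        double totalBytes = 3.54599776E10;    // dataflow_v2_java11_total_bytes_count

        double runtimeHours = runtimeSec / 3600.0;  // seconds -> hours
        double totalGb = totalBytes / 1e9;          // bytes -> decimal GB

        // Roughly 3.21 h of runtime, consistent with the 12:45-16:00 window.
        System.out.printf("runtime: %.2f h, total: %.1f GB%n", runtimeHours, totalGb);
    }
}
```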
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

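[Editor's note: the RuntimeException in the stack trace above is the load-test harness rejecting any terminal job state other than DONE — the job was cancelled by the watchdog after exceeding its time budget, and JobFailure.handleFailure then failed the build. A self-contained sketch of that check follows; the plain enum stands in for Beam's PipelineResult.State, and only the method name and message come from the trace — the rest is assumed, not Beam's actual implementation:]

```java
// Hypothetical sketch of the terminal-state check behind the stack trace
// above. JobState is a stand-in for org.apache.beam.sdk.PipelineResult.State.
public class TerminalStateCheck {
    enum JobState { DONE, FAILED, CANCELLED, UPDATED }

    // Throw unless the job finished successfully, as the harness does for
    // the CANCELLED job in this build.
    static void handleFailure(JobState state) {
        if (state != JobState.DONE) {
            throw new RuntimeException("Invalid job state: " + state + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(JobState.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // Invalid job state: CANCELLED.
        }
    }
}
```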
> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211026124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c38f0177a7ef3f57599c16a2ef5297bf189e09217ebd02d1afefe1ff41eb7208
Deleted: sha256:0173f7450c392d6bb14fb4de6d46eb67dc7d9f283ab333c9ac1037685e0915e8
Deleted: sha256:229ced64d33038238616dbc315a9d53fc6daf3977c20ae2e3d8f7ca2236fc593
Deleted: sha256:a4cb16f3b6cc8b9b6209b909d864ce561a487faccb718b0c0533052f67430a12
Deleted: sha256:af9e27166951d99e7c71fdf97683616a40d3dc21692bbb0b4e37e02ba4585be9
Deleted: sha256:75225567acd366f7ef58e52b6afa57fb82a2d5c3024de09039ca8312b65c4eb7
Deleted: sha256:405b84633693e8d19eabc2c3830bca8b24173b3caf08963d10d9d74d66ab5511
Deleted: sha256:830e2b5f97458c160a1e81d2ed8d806d923cbba194104bc0cb892938b0e9fd79
Deleted: sha256:9aa97ded24ce51414ee4a3c60a4b5488cb07fd09305a7a72e7e7f270a5959ca6
Deleted: sha256:c1a99a5069e1065f7cbc24158f6d9993d9c15a0851967949220e182fcf5a9c7f
Deleted: sha256:cbf9161771b41b1989da9f333a137358b469a5cf0eaeff3d1d65a5c23ef693b1
Deleted: sha256:85d57d9f1b685783fb657faad82c569987c70fd053f56c055062f6318dc04fc9
Deleted: sha256:a831b46ba9c1d8db15494f5bbf3e57daa8b7d867941ce1c0c8a46e7e4155c7b9
Deleted: sha256:fcf162342282b136bb7b6f72dd5fcb629aedd622be540201cb2a56bd079b8628
Deleted: sha256:1a037865db1f41761303c36a094eb61ee81cb2dd668cc186464106e908c98d6d
Deleted: sha256:959299ca5acec1334818961249f12b323f80e99cdf6a10990fc8b822fd1beecb
Deleted: sha256:182a2c6fe00c825e65dcc48aad72d7bc91fc544628dceba3f64362a06bb2b83d
Deleted: sha256:05a730a8f86d6fc7eb526041841133ddf7feb512b6b8cb341af1dfe9bd157f73
Deleted: sha256:6acb8d63dcee226436e44bb27b72b3c36a0b5f1cc4b90a7933d17d47dfd0302f
Deleted: sha256:9aaa03a5de2dfbca994aa4130f92e8be2d57ff8eb1db8fd2d48e8f6de8d756ba
Deleted: sha256:62b1e578441e94418fcc6b5644f84c2b743256520ee867e8a9603f1558950a4a
Deleted: sha256:791b3fff9c2cb56a171506e594eadc46404bd6710216c5d04c33cd1d8da57df4
Deleted: sha256:3313c6224fba6a815009d706b0e2f56fc3297a28d3521d242aa0f9087479736e
Deleted: sha256:8766115a00680c46e87d5088fac12210a6a00200c7dd73ebb6ae8fb08df5f8fe
Deleted: sha256:52b58d9c9fd05801a4f97aa1edc1e3ba080325912da21ccaf55fccad9b2066da
Deleted: sha256:9fac7ff9230aea968c486225df5c92fe094a97db6b56aef62609feac0ce63ebe
Deleted: sha256:b04f74f01c98559b002fc2a966d5b9fef7d9b8aa5f66fc7555af326b1e816db9
Deleted: sha256:0be0c5cd7c32a82e8a9ab4379990c82dc2f6cfddc1fbdf1b193aec9069842b5f
Deleted: sha256:81fa0bfb87f7a739f45a5fc2e16c1f967e84573167aa211d9ffdcacc7bd65962
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211026124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c38f0177a7ef3f57599c16a2ef5297bf189e09217ebd02d1afefe1ff41eb7208]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211026124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c38f0177a7ef3f57599c16a2ef5297bf189e09217ebd02d1afefe1ff41eb7208])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c38f0177a7ef3f57599c16a2ef5297bf189e09217ebd02d1afefe1ff41eb7208
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c38f0177a7ef3f57599c16a2ef5297bf189e09217ebd02d1afefe1ff41eb7208
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c38f0177a7ef3f57599c16a2ef5297bf189e09217ebd02d1afefe1ff41eb7208].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 57s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/flrimtg72akcy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #130

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/130/display/redirect?page=changes>

Changes:

[noreply] Fixing BigQueryIO request too big corner case for streaming inserts


------------------------------------------
[...truncated 48.99 KB...]
831f7f5467c5: Preparing
10919810519a: Preparing
cf43989a6960: Preparing
5b29ab1822bb: Preparing
cf9967dcd6ec: Preparing
44ba0f20ac6e: Preparing
612d8cc2e68f: Preparing
07856eba2e32: Preparing
8a20526dc2b3: Preparing
ad69949f2cbf: Preparing
6ab646f5193f: Preparing
9cfc9240f18d: Preparing
78700b6b35d0: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
62a747bf1719: Preparing
44ba0f20ac6e: Waiting
612d8cc2e68f: Waiting
07856eba2e32: Waiting
8a20526dc2b3: Waiting
ba6e5ff31f23: Waiting
9f9f651e9303: Waiting
ad69949f2cbf: Waiting
0b3c02b5d746: Waiting
62a747bf1719: Waiting
62a5b8741e83: Waiting
6ab646f5193f: Waiting
36e0782f1159: Waiting
9cfc9240f18d: Waiting
78700b6b35d0: Waiting
10919810519a: Pushed
cf43989a6960: Pushed
cf9967dcd6ec: Pushed
831f7f5467c5: Pushed
44ba0f20ac6e: Pushed
5b29ab1822bb: Pushed
07856eba2e32: Pushed
8a20526dc2b3: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
612d8cc2e68f: Pushed
ba6e5ff31f23: Layer already exists
6ab646f5193f: Pushed
9cfc9240f18d: Pushed
9f9f651e9303: Layer already exists
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
ad69949f2cbf: Pushed
20211025124334: digest: sha256:c08104e2d10664a4efb5b2332b04020be951c9d078fcbc95358524b0acf4c329 size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 25, 2021 12:45:27 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 25, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 25, 2021 12:45:28 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 25, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 25, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 25, 2021 12:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 25, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 25, 2021 12:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 91230eba4866ec41d6661f52d5d707332eaf22f64fe9fb97f55f01e55f3e72c9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-kSMOukhm7EHWZh9S1dcHMy6vIvZP6fuX9V8B5V8-csk.pb
Oct 25, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 25, 2021 12:45:33 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f]
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 25, 2021 12:45:34 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b]
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-25_05_45_34-11476855775060073623?project=apache-beam-testing
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-25_05_45_34-11476855775060073623
Oct 25, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-25_05_45_34-11476855775060073623
Oct 25, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-25T12:45:40.912Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-uvmg. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 25, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:44.648Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 25, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:45.375Z: Expanding SplittableParDo operations into optimizable parts.
Oct 25, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:45.406Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 25, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:45.461Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:45.538Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:45.563Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:45.639Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:45.752Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:45.772Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:45.816Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:45.839Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:45.872Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:45.906Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:45.928Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:45.957Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:45.981Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:46.018Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:46.043Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:46.076Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:46.110Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:46.167Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:46.200Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:46.226Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:46.248Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:46.272Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:46.303Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:46.327Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:46.355Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:46.379Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:46.410Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 25, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:46.769Z: Starting 5 workers in us-central1-a...
Oct 25, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:45:51.212Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 25, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:46:28.879Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 25, 2021 12:47:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:47:26.727Z: Workers have started successfully.
Oct 25, 2021 12:47:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T12:47:26.757Z: Workers have started successfully.
Oct 25, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T16:00:28.761Z: Cancel request is committed for workflow job: 2021-10-25_05_45_34-11476855775060073623.
Oct 25, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T16:00:28.835Z: Cleaning up.
Oct 25, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T16:00:28.921Z: Stopping worker pool...
Oct 25, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T16:00:28.997Z: Stopping worker pool...
Oct 25, 2021 4:02:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T16:02:48.560Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 25, 2021 4:02:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-25T16:02:48.598Z: Worker pool stopped.
Oct 25, 2021 4:02:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-25_05_45_34-11476855775060073623 finished with status CANCELLED.
Load test results for test (ID): 98e688ea-807b-4e48-8998-6aa6c3809566 and timestamp: 2021-10-25T12:45:28.244000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11551.233
dataflow_v2_java11_total_bytes_count             2.46477647E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
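The `Invalid job state: CANCELLED` exception above is raised by the load-test harness after the streaming job is cancelled at the test timeout: any terminal state other than DONE is treated as a failure. A minimal sketch of that check (hypothetical names; not the actual Beam `JobFailure` source):

```java
// Sketch of mapping a terminal Dataflow job state to pass/fail.
// JobStateCheck, State, and handleTerminalState are illustrative, not Beam APIs.
public class JobStateCheck {
    enum State { DONE, FAILED, CANCELLED }

    // Returns normally only for DONE; otherwise fails the run,
    // mirroring the "Invalid job state" RuntimeException in the log.
    static void handleTerminalState(State s) {
        if (s != State.DONE) {
            throw new RuntimeException("Invalid job state: " + s + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleTerminalState(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // prints: Invalid job state: CANCELLED.
        }
    }
}
```

Under this reading, the build failure is expected bookkeeping: the job ran for the full window (runtime_sec above), was cancelled by the harness, and the CANCELLED terminal state is then surfaced as a non-zero exit from the Gradle `run` task.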

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211025124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c08104e2d10664a4efb5b2332b04020be951c9d078fcbc95358524b0acf4c329
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211025124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c08104e2d10664a4efb5b2332b04020be951c9d078fcbc95358524b0acf4c329]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211025124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c08104e2d10664a4efb5b2332b04020be951c9d078fcbc95358524b0acf4c329])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c08104e2d10664a4efb5b2332b04020be951c9d078fcbc95358524b0acf4c329
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c08104e2d10664a4efb5b2332b04020be951c9d078fcbc95358524b0acf4c329
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c08104e2d10664a4efb5b2332b04020be951c9d078fcbc95358524b0acf4c329].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 42s
101 actionable tasks: 71 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/daorjlr55yyuc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #129

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/129/display/redirect>

Changes:


------------------------------------------
[...truncated 48.57 KB...]
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
b78ae8b444a3: Preparing
62d6b0f560f1: Preparing
814668c08eb1: Preparing
a968cd95a014: Preparing
e96b5b261848: Preparing
788b0e737a68: Preparing
8cc00797d551: Preparing
70217549fe31: Preparing
60ab889751cd: Preparing
1e0b3e8484d3: Preparing
c7b06e92a90f: Preparing
e8980afecadc: Preparing
78700b6b35d0: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
62a747bf1719: Preparing
60ab889751cd: Waiting
788b0e737a68: Waiting
1e0b3e8484d3: Waiting
8cc00797d551: Waiting
c7b06e92a90f: Waiting
70217549fe31: Waiting
e8980afecadc: Waiting
78700b6b35d0: Waiting
62a5b8741e83: Waiting
62a747bf1719: Waiting
36e0782f1159: Waiting
9f9f651e9303: Waiting
ba6e5ff31f23: Waiting
62d6b0f560f1: Pushed
814668c08eb1: Pushed
e96b5b261848: Pushed
788b0e737a68: Pushed
b78ae8b444a3: Pushed
70217549fe31: Pushed
a968cd95a014: Pushed
60ab889751cd: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
9f9f651e9303: Layer already exists
c7b06e92a90f: Pushed
e8980afecadc: Pushed
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
8cc00797d551: Pushed
1e0b3e8484d3: Pushed
20211024124337: digest: sha256:5a124b45284c3cf8a3aefbedde277e9c1caf27422178312ebe7827bfd8bb4a3d size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 24, 2021 12:45:47 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 24, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 24, 2021 12:45:49 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 24, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 24, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 82ef116bc867718c215b522534d48287a8aee43ec7cd3d11a63531af932e4df6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-gu8Ra8hncYwhW1IlNNSCh6iu5D7HzT0RpjUxr5MuTfY.pb
Oct 24, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 24, 2021 12:45:53 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@aaa0f76]
Oct 24, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 24, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 24, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 24, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 24, 2021 12:45:53 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34b9eb03]
Oct 24, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 24, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 24, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 24, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 24, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 24, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 24, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 24, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 24, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 24, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 24, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 24, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 24, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-24_05_45_53-15648205050698846936?project=apache-beam-testing
Oct 24, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-24_05_45_53-15648205050698846936
Oct 24, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-24_05_45_53-15648205050698846936
Oct 24, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-24T12:46:03.797Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-11tb. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:08.314Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.026Z: Expanding SplittableParDo operations into optimizable parts.
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.050Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.111Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.181Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.211Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.266Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.375Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.412Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.437Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.475Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.506Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.530Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.555Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.578Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.602Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.628Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.663Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.693Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.720Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.753Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.780Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.807Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.828Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.861Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.895Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.928Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.961Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:09.991Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:10.027Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:10.379Z: Starting 5 workers in us-central1-a...
Oct 24, 2021 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:14.214Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 24, 2021 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:46:50.812Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 24, 2021 12:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:47:44.534Z: Workers have started successfully.
Oct 24, 2021 12:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T12:47:44.556Z: Workers have started successfully.
Oct 24, 2021 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T16:00:26.295Z: Cancel request is committed for workflow job: 2021-10-24_05_45_53-15648205050698846936.
Oct 24, 2021 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T16:00:26.367Z: Cleaning up.
Oct 24, 2021 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T16:00:26.457Z: Stopping worker pool...
Oct 24, 2021 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T16:00:26.534Z: Stopping worker pool...
Oct 24, 2021 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T16:02:50.100Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 24, 2021 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-24T16:02:50.141Z: Worker pool stopped.
Oct 24, 2021 4:02:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-24_05_45_53-15648205050698846936 finished with status CANCELLED.
Load test results for test (ID): ce635266-db81-4432-a184-265a9ed66a37 and timestamp: 2021-10-24T12:45:48.716000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11518.388
dataflow_v2_java11_total_bytes_count              1.5485639E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211024124337
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5a124b45284c3cf8a3aefbedde277e9c1caf27422178312ebe7827bfd8bb4a3d
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211024124337]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5a124b45284c3cf8a3aefbedde277e9c1caf27422178312ebe7827bfd8bb4a3d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211024124337] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5a124b45284c3cf8a3aefbedde277e9c1caf27422178312ebe7827bfd8bb4a3d])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5a124b45284c3cf8a3aefbedde277e9c1caf27422178312ebe7827bfd8bb4a3d
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5a124b45284c3cf8a3aefbedde277e9c1caf27422178312ebe7827bfd8bb4a3d
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5a124b45284c3cf8a3aefbedde277e9c1caf27422178312ebe7827bfd8bb4a3d].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 38s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4q3tm3yqk3k7m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #128

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/128/display/redirect>

Changes:


------------------------------------------
[...truncated 49.31 KB...]
62a747bf1719: Waiting
0b3c02b5d746: Waiting
9f9f651e9303: Waiting
1813b4329c90: Pushed
5d8f284749f8: Pushed
58999aee1641: Pushed
cf476769792b: Pushed
57b76b70e2e9: Pushed
b61a7f33d3bd: Pushed
661388b38f1f: Pushed
54d9132de0ae: Pushed
1e5bbac8f158: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
2160cddcd869: Pushed
9f9f651e9303: Layer already exists
fa72c85d8524: Pushed
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
dcddb0ae3ae3: Pushed
20211023124333: digest: sha256:885d360e00e774e70efd4d3431bb813b11e1f2f3ec001fcceb8e64d4c922855b size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 23, 2021 12:45:20 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 23, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 23, 2021 12:45:21 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 23, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 23, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 23, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 23, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 23, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 7150d364fc0b39096a5e436a8ffa17d66347c9d2bc148256d8e76041b4a6df51> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-cVDTZPwLOQlqXkNqj_oX1mNHydK8FIJW2OdgQbSm31E.pb
Oct 23, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 23, 2021 12:45:25 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f]
Oct 23, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 23, 2021 12:45:26 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b]
Oct 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-23_05_45_26-11254189268418989776?project=apache-beam-testing
Oct 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-23_05_45_26-11254189268418989776
Oct 23, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-23_05_45_26-11254189268418989776
Oct 23, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-23T12:45:33.710Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-gmis. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:38.016Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:38.753Z: Expanding SplittableParDo operations into optimizable parts.
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:38.792Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:38.867Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:38.939Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:38.965Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.048Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.141Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.182Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.207Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.229Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.264Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.289Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.323Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.348Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.384Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.422Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.455Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.492Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.515Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.536Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.571Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.595Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.619Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.647Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.674Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.706Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.729Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.760Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:39.807Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 23, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:45:40.135Z: Starting 5 workers in us-central1-a...
Oct 23, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:46:10.073Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 23, 2021 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:46:23.009Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 23, 2021 12:47:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:47:17.865Z: Workers have started successfully.
Oct 23, 2021 12:47:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T12:47:17.898Z: Workers have started successfully.
Oct 23, 2021 3:42:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-23T15:42:40.431Z: Staged package arrow-format-5.0.0-FzXW0Kc0wcM-4qPMnXkmsxGom9LbiofNT9gHF14a_l4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/arrow-format-5.0.0-FzXW0Kc0wcM-4qPMnXkmsxGom9LbiofNT9gHF14a_l4.jar' is inaccessible.
Oct 23, 2021 3:42:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-23T15:42:40.531Z: Staged package arrow-memory-core-5.0.0-cTMnF35Hj_AzHUn7iWyDPIq8rG9o8ZjREpZtIg17P4M.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/arrow-memory-core-5.0.0-cTMnF35Hj_AzHUn7iWyDPIq8rG9o8ZjREpZtIg17P4M.jar' is inaccessible.
Oct 23, 2021 3:42:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-23T15:42:40.576Z: Staged package arrow-vector-5.0.0-qCLc2jGU2wVSqfy4RSUPuN0xKMz05B4sYNeFxQPEwnI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/arrow-vector-5.0.0-qCLc2jGU2wVSqfy4RSUPuN0xKMz05B4sYNeFxQPEwnI.jar' is inaccessible.
Oct 23, 2021 3:42:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-23T15:42:44.341Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 23, 2021 3:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-23T15:45:44.131Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 23, 2021 3:48:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-23T15:48:40.486Z: Staged package arrow-format-5.0.0-FzXW0Kc0wcM-4qPMnXkmsxGom9LbiofNT9gHF14a_l4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/arrow-format-5.0.0-FzXW0Kc0wcM-4qPMnXkmsxGom9LbiofNT9gHF14a_l4.jar' is inaccessible.
Oct 23, 2021 3:48:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-23T15:48:40.568Z: Staged package arrow-memory-core-5.0.0-cTMnF35Hj_AzHUn7iWyDPIq8rG9o8ZjREpZtIg17P4M.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/arrow-memory-core-5.0.0-cTMnF35Hj_AzHUn7iWyDPIq8rG9o8ZjREpZtIg17P4M.jar' is inaccessible.
Oct 23, 2021 3:48:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-23T15:48:40.600Z: Staged package arrow-vector-5.0.0-qCLc2jGU2wVSqfy4RSUPuN0xKMz05B4sYNeFxQPEwnI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/arrow-vector-5.0.0-qCLc2jGU2wVSqfy4RSUPuN0xKMz05B4sYNeFxQPEwnI.jar' is inaccessible.
Oct 23, 2021 3:48:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-23T15:48:44.120Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 23, 2021 3:51:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-23T15:51:43.991Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 23, 2021 3:54:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-23T15:54:40.447Z: Staged package arrow-format-5.0.0-FzXW0Kc0wcM-4qPMnXkmsxGom9LbiofNT9gHF14a_l4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/arrow-format-5.0.0-FzXW0Kc0wcM-4qPMnXkmsxGom9LbiofNT9gHF14a_l4.jar' is inaccessible.
Oct 23, 2021 3:54:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-23T15:54:40.488Z: Staged package arrow-memory-core-5.0.0-cTMnF35Hj_AzHUn7iWyDPIq8rG9o8ZjREpZtIg17P4M.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/arrow-memory-core-5.0.0-cTMnF35Hj_AzHUn7iWyDPIq8rG9o8ZjREpZtIg17P4M.jar' is inaccessible.
Oct 23, 2021 3:54:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-23T15:54:40.565Z: Staged package arrow-vector-5.0.0-qCLc2jGU2wVSqfy4RSUPuN0xKMz05B4sYNeFxQPEwnI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/arrow-vector-5.0.0-qCLc2jGU2wVSqfy4RSUPuN0xKMz05B4sYNeFxQPEwnI.jar' is inaccessible.
Oct 23, 2021 3:54:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-23T15:54:44.183Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 23, 2021 3:57:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-23T15:57:44.040Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 23, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T16:00:33.057Z: Cancel request is committed for workflow job: 2021-10-23_05_45_26-11254189268418989776.
Oct 23, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T16:00:33.085Z: Cleaning up.
Oct 23, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T16:00:33.159Z: Stopping worker pool...
Oct 23, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T16:00:33.217Z: Stopping worker pool...
Oct 23, 2021 4:02:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T16:02:57.201Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 23, 2021 4:02:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-23T16:02:57.236Z: Worker pool stopped.
Oct 23, 2021 4:03:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-23_05_45_26-11254189268418989776 finished with status CANCELLED.
Load test results for test (ID): 2e74a720-71a4-4c83-ab57-addad6fa1cad and timestamp: 2021-10-23T12:45:21.373000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11554.335
dataflow_v2_java11_total_bytes_count             2.49740768E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211023124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:885d360e00e774e70efd4d3431bb813b11e1f2f3ec001fcceb8e64d4c922855b
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211023124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:885d360e00e774e70efd4d3431bb813b11e1f2f3ec001fcceb8e64d4c922855b]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211023124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:885d360e00e774e70efd4d3431bb813b11e1f2f3ec001fcceb8e64d4c922855b])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:885d360e00e774e70efd4d3431bb813b11e1f2f3ec001fcceb8e64d4c922855b
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:885d360e00e774e70efd4d3431bb813b11e1f2f3ec001fcceb8e64d4c922855b
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:885d360e00e774e70efd4d3431bb813b11e1f2f3ec001fcceb8e64d4c922855b].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
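[Editor's note: the Gradle diagnostic flags suggested above can be combined into a single rerun of the failing task. This is a sketch only — the wrapper path and task name are taken from the log above, and the flags are Gradle's own suggestions; it is not part of the original log.]

```shell
# From the Beam source checkout, rerun the failing load-test task with
# full diagnostics: --stacktrace for the exception chain, --info for
# verbose log output, --scan to publish a build scan.
./gradlew :sdks:java:testing:load-tests:run --stacktrace --info --scan
```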

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 45s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/v6q6rf57l3neo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #127

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/127/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #15761 from [BEAM-13008] Create gradle tasks for the

[noreply] [BEAM-13096] Double test timeout. (#15774)

[noreply] [BEAM-13019] Add `containsInAnyOrder` with matchers to the

[Daniel Oliveira] Avoiding read-only Go module cache in Gradle config.

[noreply] [BEAM-11758] Update basics page: Pipeline, PCollection, PTransform

[noreply] Test SetState addIfAbsent with no read (#15776)

[noreply] lazy creation of source splits for export-based ReadFromBigQuery

[noreply] [BEAM-11275] Support remote package download from remote filesystems in

[noreply] [BEAM-13015] Create a multiplexer that sends Elements based upon


------------------------------------------
[...truncated 53.59 KB...]
Oct 22, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 22, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 22, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 22, 2021 12:45:27 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e]
Oct 22, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 22, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 22, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 22, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 22, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 22, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 22, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 22, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 22, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 22, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 22, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 22, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 22, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-22_05_45_27-8754089351573128322?project=apache-beam-testing
Oct 22, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-22_05_45_27-8754089351573128322
Oct 22, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-22_05_45_27-8754089351573128322
Oct 22, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-22T12:45:40.746Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-2smy. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 22, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:45.008Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:45.769Z: Expanding SplittableParDo operations into optimizable parts.
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:45.790Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:45.839Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:45.898Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:45.929Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:45.988Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.089Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.127Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.159Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.194Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.219Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.242Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.268Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.298Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.334Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.357Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.390Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.413Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.430Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.456Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.478Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.510Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.542Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.567Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.594Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.623Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.644Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.677Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:46.708Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 22, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:45:47.043Z: Starting 5 workers in us-central1-a...
Oct 22, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:46:03.956Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 22, 2021 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:46:31.151Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 22, 2021 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:47:23.177Z: Workers have started successfully.
Oct 22, 2021 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T12:47:23.204Z: Workers have started successfully.
Oct 22, 2021 3:51:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:51:47.504Z: Staged package aws-java-sdk-cloudwatch-1.11.974-Ub2yicrz1p27ushPndFcVBigc3SuIZ4G-Yaq45B1iUo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-cloudwatch-1.11.974-Ub2yicrz1p27ushPndFcVBigc3SuIZ4G-Yaq45B1iUo.jar' is inaccessible.
Oct 22, 2021 3:51:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:51:47.578Z: Staged package aws-java-sdk-core-1.11.974-s3lPhICKsq1xExGOyj03VQHRqOsuX5YUc-UUlnhiA64.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-core-1.11.974-s3lPhICKsq1xExGOyj03VQHRqOsuX5YUc-UUlnhiA64.jar' is inaccessible.
Oct 22, 2021 3:51:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:51:47.616Z: Staged package aws-java-sdk-dynamodb-1.11.974-RLvIDrX-TDIDN06gNnaHm9dQbuHCD_XSBesOuniSTk4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-dynamodb-1.11.974-RLvIDrX-TDIDN06gNnaHm9dQbuHCD_XSBesOuniSTk4.jar' is inaccessible.
Oct 22, 2021 3:51:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:51:47.674Z: Staged package aws-java-sdk-kinesis-1.11.974-kbU_CjbRpkrhhFeoyIMQA81nTYmniJY77w8Smd7PLpY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-kinesis-1.11.974-kbU_CjbRpkrhhFeoyIMQA81nTYmniJY77w8Smd7PLpY.jar' is inaccessible.
Oct 22, 2021 3:51:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:51:47.728Z: Staged package aws-java-sdk-kms-1.11.974-HWH_mHclw3Wv4_GJ56x62xFDHSPSkZWjeKAlx40XrTk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-kms-1.11.974-HWH_mHclw3Wv4_GJ56x62xFDHSPSkZWjeKAlx40XrTk.jar' is inaccessible.
Oct 22, 2021 3:51:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:51:47.768Z: Staged package aws-java-sdk-s3-1.11.974-Qgd0wgaOuI_uyxxQI4Ulljk9bZEq-3JZArQF0zU5yzA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-s3-1.11.974-Qgd0wgaOuI_uyxxQI4Ulljk9bZEq-3JZArQF0zU5yzA.jar' is inaccessible.
Oct 22, 2021 3:51:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:51:48.423Z: Staged package commons-io-2.6--HfTBGYKwqFC84ZbrfyXHex-1zx0fH-NXS9ROcpzZRM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-io-2.6--HfTBGYKwqFC84ZbrfyXHex-1zx0fH-NXS9ROcpzZRM.jar' is inaccessible.
Oct 22, 2021 3:51:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:51:48.458Z: Staged package commons-lang-2.6-UPEbCfh3wpTVbyRGP0fSj5Kc9QRPZIZhwPDPuumi9Jw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-lang-2.6-UPEbCfh3wpTVbyRGP0fSj5Kc9QRPZIZhwPDPuumi9Jw.jar' is inaccessible.
Oct 22, 2021 3:51:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:51:48.605Z: Staged package error_prone_annotations-2.3.4-uvfW6pfOYGxT4RtoVLpfLOfvXCTd3wr6GNEmC9JbACw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/error_prone_annotations-2.3.4-uvfW6pfOYGxT4RtoVLpfLOfvXCTd3wr6GNEmC9JbACw.jar' is inaccessible.
Oct 22, 2021 3:51:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:51:49.908Z: Staged package ion-java-1.0.2-DRJ7IFofzgq8KjdXoEF0hlG8ZsFc9MBZusWDOyfUcaU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/ion-java-1.0.2-DRJ7IFofzgq8KjdXoEF0hlG8ZsFc9MBZusWDOyfUcaU.jar' is inaccessible.
Oct 22, 2021 3:51:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:51:50.036Z: Staged package jackson-dataformat-cbor-2.12.4-xpFy2QEl4dUh55AUzgOueyVcIoNGuAdPK6G3l4P-yCc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-dataformat-cbor-2.12.4-xpFy2QEl4dUh55AUzgOueyVcIoNGuAdPK6G3l4P-yCc.jar' is inaccessible.
Oct 22, 2021 3:51:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:51:50.186Z: Staged package jmespath-java-1.11.974-QaUt5Hnw6wd7ULTQ5Oj4j_zpRr-rnMCdfGwiwkSo8-0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jmespath-java-1.11.974-QaUt5Hnw6wd7ULTQ5Oj4j_zpRr-rnMCdfGwiwkSo8-0.jar' is inaccessible.
Oct 22, 2021 3:51:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-22T15:51:51.457Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 22, 2021 3:54:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-22T15:54:50.652Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 22, 2021 3:57:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:57:47.484Z: Staged package aws-java-sdk-cloudwatch-1.11.974-Ub2yicrz1p27ushPndFcVBigc3SuIZ4G-Yaq45B1iUo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-cloudwatch-1.11.974-Ub2yicrz1p27ushPndFcVBigc3SuIZ4G-Yaq45B1iUo.jar' is inaccessible.
Oct 22, 2021 3:57:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:57:47.521Z: Staged package aws-java-sdk-core-1.11.974-s3lPhICKsq1xExGOyj03VQHRqOsuX5YUc-UUlnhiA64.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-core-1.11.974-s3lPhICKsq1xExGOyj03VQHRqOsuX5YUc-UUlnhiA64.jar' is inaccessible.
Oct 22, 2021 3:57:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:57:47.568Z: Staged package aws-java-sdk-dynamodb-1.11.974-RLvIDrX-TDIDN06gNnaHm9dQbuHCD_XSBesOuniSTk4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-dynamodb-1.11.974-RLvIDrX-TDIDN06gNnaHm9dQbuHCD_XSBesOuniSTk4.jar' is inaccessible.
Oct 22, 2021 3:57:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:57:47.618Z: Staged package aws-java-sdk-kinesis-1.11.974-kbU_CjbRpkrhhFeoyIMQA81nTYmniJY77w8Smd7PLpY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-kinesis-1.11.974-kbU_CjbRpkrhhFeoyIMQA81nTYmniJY77w8Smd7PLpY.jar' is inaccessible.
Oct 22, 2021 3:57:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:57:47.668Z: Staged package aws-java-sdk-kms-1.11.974-HWH_mHclw3Wv4_GJ56x62xFDHSPSkZWjeKAlx40XrTk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-kms-1.11.974-HWH_mHclw3Wv4_GJ56x62xFDHSPSkZWjeKAlx40XrTk.jar' is inaccessible.
Oct 22, 2021 3:57:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:57:47.713Z: Staged package aws-java-sdk-s3-1.11.974-Qgd0wgaOuI_uyxxQI4Ulljk9bZEq-3JZArQF0zU5yzA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/aws-java-sdk-s3-1.11.974-Qgd0wgaOuI_uyxxQI4Ulljk9bZEq-3JZArQF0zU5yzA.jar' is inaccessible.
Oct 22, 2021 3:57:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:57:48.366Z: Staged package commons-io-2.6--HfTBGYKwqFC84ZbrfyXHex-1zx0fH-NXS9ROcpzZRM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-io-2.6--HfTBGYKwqFC84ZbrfyXHex-1zx0fH-NXS9ROcpzZRM.jar' is inaccessible.
Oct 22, 2021 3:57:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:57:48.417Z: Staged package commons-lang-2.6-UPEbCfh3wpTVbyRGP0fSj5Kc9QRPZIZhwPDPuumi9Jw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-lang-2.6-UPEbCfh3wpTVbyRGP0fSj5Kc9QRPZIZhwPDPuumi9Jw.jar' is inaccessible.
Oct 22, 2021 3:57:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:57:48.571Z: Staged package error_prone_annotations-2.3.4-uvfW6pfOYGxT4RtoVLpfLOfvXCTd3wr6GNEmC9JbACw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/error_prone_annotations-2.3.4-uvfW6pfOYGxT4RtoVLpfLOfvXCTd3wr6GNEmC9JbACw.jar' is inaccessible.
Oct 22, 2021 3:57:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:57:49.949Z: Staged package ion-java-1.0.2-DRJ7IFofzgq8KjdXoEF0hlG8ZsFc9MBZusWDOyfUcaU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/ion-java-1.0.2-DRJ7IFofzgq8KjdXoEF0hlG8ZsFc9MBZusWDOyfUcaU.jar' is inaccessible.
Oct 22, 2021 3:57:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:57:50.097Z: Staged package jackson-dataformat-cbor-2.12.4-xpFy2QEl4dUh55AUzgOueyVcIoNGuAdPK6G3l4P-yCc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-dataformat-cbor-2.12.4-xpFy2QEl4dUh55AUzgOueyVcIoNGuAdPK6G3l4P-yCc.jar' is inaccessible.
Oct 22, 2021 3:57:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-22T15:57:50.281Z: Staged package jmespath-java-1.11.974-QaUt5Hnw6wd7ULTQ5Oj4j_zpRr-rnMCdfGwiwkSo8-0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jmespath-java-1.11.974-QaUt5Hnw6wd7ULTQ5Oj4j_zpRr-rnMCdfGwiwkSo8-0.jar' is inaccessible.
Oct 22, 2021 3:57:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-22T15:57:51.242Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
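Most of the failure output above is the same "Staged package ... is inaccessible" error repeated once per jar, twice per access-check round. When triaging a saved copy of such a log, the distinct affected jars can be pulled out with standard text tools. This is a hedged sketch: the sample lines are abbreviated copies of entries from the log above (the gs:// paths are shortened by me), and the temp file path is an arbitrary choice.

```shell
# Extract the distinct inaccessible staged jars from "Staged package ... is
# inaccessible" log lines. Sample input is abbreviated from the log above.
cat > /tmp/staging_errors.txt <<'EOF'
SEVERE: 2021-10-22T15:51:47.504Z: Staged package aws-java-sdk-cloudwatch-1.11.974-Ub2yicrz1p27ushPndFcVBigc3SuIZ4G-Yaq45B1iUo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/redacted.jar' is inaccessible.
SEVERE: 2021-10-22T15:57:47.484Z: Staged package aws-java-sdk-cloudwatch-1.11.974-Ub2yicrz1p27ushPndFcVBigc3SuIZ4G-Yaq45B1iUo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/redacted.jar' is inaccessible.
SEVERE: 2021-10-22T15:51:48.423Z: Staged package commons-io-2.6--HfTBGYKwqFC84ZbrfyXHex-1zx0fH-NXS9ROcpzZRM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/redacted.jar' is inaccessible.
EOF
# Each distinct jar is reported once, no matter how many access-check rounds
# repeated it in the original log.
grep -o 'Staged package [^ ]*' /tmp/staging_errors.txt | sed 's/^Staged package //' | sort -u
```

With the full log as input, this collapses the two rounds of twelve SEVERE lines above into one line per jar.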
Oct 22, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T16:00:30.705Z: Cancel request is committed for workflow job: 2021-10-22_05_45_27-8754089351573128322.
Oct 22, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T16:00:30.739Z: Cleaning up.
Oct 22, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T16:00:30.821Z: Stopping worker pool...
Oct 22, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T16:00:30.901Z: Stopping worker pool...
Oct 22, 2021 4:02:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T16:02:50.988Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 22, 2021 4:02:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-22T16:02:51.059Z: Worker pool stopped.
Oct 22, 2021 4:02:56 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-22_05_45_27-8754089351573128322 finished with status CANCELLED.
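The terminal state in the line above (CANCELLED) is what drives the RuntimeException further down. A minimal, hedged sketch of pulling that state back out of a captured log line with plain shell parameter expansion; the line is copied verbatim from the log, and this is a triage convenience, not part of the load-test framework itself.

```shell
# Parse the terminal job state out of the DataflowPipelineJob log line above.
LOG_LINE='INFO: Job 2021-10-22_05_45_27-8754089351573128322 finished with status CANCELLED.'
STATE="${LOG_LINE##* }"   # last whitespace-separated token: "CANCELLED."
STATE="${STATE%.}"        # drop the trailing period
echo "$STATE"             # CANCELLED
```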
Load test results for test (ID): 461d56ac-b960-479f-83e1-9186f0683a7e and timestamp: 2021-10-22T12:45:22.419000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11519.193
dataflow_v2_java11_total_bytes_count             2.22514959E10
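From the two metrics reported above, an average throughput can be derived for comparing runs. A hedged one-liner: the numbers are the ones printed above, and the bytes/sec figure is simply total bytes divided by runtime, nothing the framework itself reports.

```shell
# Average throughput for this run: total_bytes_count / runtime_sec.
RUNTIME_SEC=11519.193
TOTAL_BYTES=2.22514959E10
awk -v b="$TOTAL_BYTES" -v s="$RUNTIME_SEC" 'BEGIN { printf "%.0f bytes/sec\n", b / s }'
```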
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211022124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:fde408f783525f15f7516cdd3955d0a75837d3b11f9031be4247590ff832a8a7
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211022124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:fde408f783525f15f7516cdd3955d0a75837d3b11f9031be4247590ff832a8a7]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211022124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:fde408f783525f15f7516cdd3955d0a75837d3b11f9031be4247590ff832a8a7])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:fde408f783525f15f7516cdd3955d0a75837d3b11f9031be4247590ff832a8a7
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:fde408f783525f15f7516cdd3955d0a75837d3b11f9031be4247590ff832a8a7
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:fde408f783525f15f7516cdd3955d0a75837d3b11f9031be4247590ff832a8a7].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:600e0b7a283b0c8b376be8fe9960594503735a5f3a285f91083d3aec31b13b63
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:600e0b7a283b0c8b376be8fe9960594503735a5f3a285f91083d3aec31b13b63
ERROR: (gcloud.container.images.delete) Not found: response: {'status': '404', 'content-length': '168', 'x-xss-protection': '0', 'transfer-encoding': 'chunked', 'server': 'Docker Registry', '-content-encoding': 'gzip', 'docker-distribution-api-version': 'registry/2.0', 'cache-control': 'private', 'date': 'Fri, 22 Oct 2021 16:03:00 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
Failed to compute blob liveness for manifest: 'sha256:600e0b7a283b0c8b376be8fe9960594503735a5f3a285f91083d3aec31b13b63': None

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle>' line: 282

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 40s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ao3ubinetr776

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #126

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/126/display/redirect?page=changes>

Changes:

[noreply] Add -XX:+AlwaysActAsServerClassMachine to Java SDK container

[moritz] adhoc: Minor update to flink runner docs

[noreply] [BEAM-11087] Add default WindowMappingFn from Main to Side Input

[noreply] [BEAM-13082] Re-use dataWriter buffer. (#15762)


------------------------------------------
[...truncated 48.66 KB...]
27d7589082a3: Preparing
df718bf95b1f: Preparing
fdbee3fc0ed7: Preparing
13f89c075c80: Preparing
c2e45a9a6908: Preparing
211736d7fc9d: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
62a747bf1719: Preparing
fdbee3fc0ed7: Waiting
13f89c075c80: Waiting
c2e45a9a6908: Waiting
9f9f651e9303: Waiting
0b3c02b5d746: Waiting
211736d7fc9d: Waiting
62a5b8741e83: Waiting
b46796526af9: Waiting
36e0782f1159: Waiting
ba6e5ff31f23: Waiting
62a747bf1719: Waiting
df718bf95b1f: Waiting
7f341d529cec: Waiting
27d7589082a3: Waiting
e75773a91867: Pushed
c2cd88aa4da1: Pushed
d81f99b93918: Pushed
bf560bac8f42: Pushed
7f341d529cec: Pushed
27d7589082a3: Pushed
9efe5625e8ba: Pushed
df718bf95b1f: Pushed
211736d7fc9d: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
9f9f651e9303: Layer already exists
b46796526af9: Pushed
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
13f89c075c80: Pushed
c2e45a9a6908: Pushed
fdbee3fc0ed7: Pushed
20211021124334: digest: sha256:050ac8bca4e318f4189ef746a2f9d28ab1c6c5c69b77e4460d07ae18147826dc size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 21, 2021 12:45:22 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 21, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 21, 2021 12:45:23 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 21, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 21, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 21, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 21, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 21, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 6e8f69c3b865aabbc467c2863b517519c648dd1e38a1a9650bdb957396b1aecf> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-bo9pw7hlqrvEZ8KGO1F1GcZI3R44oallC9uVc5axrs8.pb
Oct 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 21, 2021 12:45:27 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f]
Oct 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 21, 2021 12:45:27 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b]
Oct 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 21, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-21_05_45_27-12856435589279297628?project=apache-beam-testing
Oct 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-21_05_45_27-12856435589279297628
Oct 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-21_05_45_27-12856435589279297628
Oct 21, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-21T12:45:33.549Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-sgei. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 21, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:42.447Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:43.223Z: Expanding SplittableParDo operations into optimizable parts.
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:43.247Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:43.318Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:43.392Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:43.459Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:43.519Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:43.623Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:43.660Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:43.698Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:43.726Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:43.752Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:43.795Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:43.826Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:43.860Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:43.889Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:43.927Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:43.961Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:43.995Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:44.025Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:44.049Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:44.083Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:44.128Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:44.150Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:44.177Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:44.205Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:44.229Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:44.253Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:44.284Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:44.317Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 21, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:45:44.700Z: Starting 5 workers in us-central1-a...
Oct 21, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:46:01.901Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 21, 2021 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:46:24.356Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 21, 2021 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:47:28.300Z: Workers have started successfully.
Oct 21, 2021 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T12:47:28.333Z: Workers have started successfully.
Oct 21, 2021 2:11:48 PM org.apache.beam.sdk.metrics.MetricsEnvironment getCurrentContainer
WARNING: Reporting metrics are not supported in the current execution environment.
Oct 21, 2021 3:53:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-21T15:53:43.830Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:1089
Oct 21, 2021 4:01:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T16:01:00.167Z: Cancel request is committed for workflow job: 2021-10-21_05_45_27-12856435589279297628.
Oct 21, 2021 4:01:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T16:01:00.210Z: Cleaning up.
Oct 21, 2021 4:01:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T16:01:00.299Z: Stopping worker pool...
Oct 21, 2021 4:01:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T16:01:00.353Z: Stopping worker pool...
Oct 21, 2021 4:03:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T16:03:28.019Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 21, 2021 4:03:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-21T16:03:28.111Z: Worker pool stopped.
Oct 21, 2021 4:03:35 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-21_05_45_27-12856435589279297628 finished with status CANCELLED.
Load test results for test (ID): 40bd8b67-6ded-44c3-948c-8e43c1b7f652 and timestamp: 2021-10-21T12:45:22.796000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11555.628
dataflow_v2_java11_total_bytes_count             2.51379893E10
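The two metrics above imply an average throughput for the run; a small standalone sketch of that arithmetic (the class name is illustrative, the values are copied from this log):

```java
// Derives average throughput from the two load-test metrics reported above.
// The metric values are taken from this run; the computation is plain
// arithmetic, not part of the Beam load-test harness.
public class ThroughputFromMetrics {
    public static void main(String[] args) {
        double runtimeSec = 11555.628;      // dataflow_v2_java11_runtime_sec
        double totalBytes = 2.51379893E10;  // dataflow_v2_java11_total_bytes_count
        double bytesPerSec = totalBytes / runtimeSec;
        // Roughly 2.18 MB/s averaged over the ~3.2 hour run.
        System.out.printf("~%.2f MB/s%n", bytesPerSec / 1e6);
    }
}
```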
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
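The exception above is raised because the harness inspects the job's terminal state and treats anything other than a successful completion as a failure, so a run that hits its timeout and is cancelled surfaces as "Invalid job state: CANCELLED". A minimal sketch of that check (class and method names are illustrative, not the actual Beam JobFailure code):

```java
// Hypothetical sketch of terminal-state handling: a cancelled job is
// reported as a failed load test, matching the stack trace above.
public class TerminalStateCheck {
    enum State { DONE, FAILED, CANCELLED, UPDATED }

    // Throws for any terminal state other than DONE.
    static void handleFailure(State terminalState) {
        if (terminalState != State.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        handleFailure(State.DONE);  // succeeds silently
        try {
            handleFailure(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());  // Invalid job state: CANCELLED.
        }
    }
}
```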

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211021124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:050ac8bca4e318f4189ef746a2f9d28ab1c6c5c69b77e4460d07ae18147826dc
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211021124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:050ac8bca4e318f4189ef746a2f9d28ab1c6c5c69b77e4460d07ae18147826dc]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211021124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:050ac8bca4e318f4189ef746a2f9d28ab1c6c5c69b77e4460d07ae18147826dc])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:050ac8bca4e318f4189ef746a2f9d28ab1c6c5c69b77e4460d07ae18147826dc
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:050ac8bca4e318f4189ef746a2f9d28ab1c6c5c69b77e4460d07ae18147826dc
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:050ac8bca4e318f4189ef746a2f9d28ab1c6c5c69b77e4460d07ae18147826dc].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 20m 19s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cfsjnax2g2pws

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #125

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/125/display/redirect?page=changes>

Changes:

[samuelw] [BEAM-13042] Prevent unexpected blocking in

[ilya.kozyrev] Implement initial server with grpcweb wrapper

[aydar.zaynutdinov] [BEAM_13077][Playground]

[egalpin] [BEAM-10990] Adds response filtering for ElasticsearchIO

[egalpin] [BEAM-5172] Tries to reduce ES uTest flakiness

[noreply] [BEAM-13068] Add a SQL API in Beam Go SDK (#15746)

[Kyle Weaver] [BEAM-13055] Use unshallow clone to create PR.

[noreply] [BEAM-13079] Updates cross-language transform URNs to use the new


------------------------------------------
[...truncated 49.57 KB...]
a1d4aca25996: Preparing
d3f5757701a4: Preparing
6fdee1d11a8b: Preparing
c59a11cff868: Preparing
fa54838fc732: Preparing
aa6ce0101937: Preparing
1ea7135b02a7: Preparing
3c762d54a262: Preparing
211736d7fc9d: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
62a747bf1719: Preparing
c59a11cff868: Waiting
3c762d54a262: Waiting
1ea7135b02a7: Waiting
9f9f651e9303: Waiting
fa54838fc732: Waiting
62a5b8741e83: Waiting
0b3c02b5d746: Waiting
aa6ce0101937: Waiting
36e0782f1159: Waiting
62a747bf1719: Waiting
ba6e5ff31f23: Waiting
d3f5757701a4: Waiting
6fdee1d11a8b: Waiting
8b0817e7e651: Pushed
ce09958f4a5a: Pushed
a1d4aca25996: Pushed
d3f5757701a4: Pushed
505ede38669d: Pushed
c59a11cff868: Pushed
e054861bdeb5: Pushed
fa54838fc732: Pushed
1ea7135b02a7: Pushed
6fdee1d11a8b: Pushed
211736d7fc9d: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
3c762d54a262: Pushed
ba6e5ff31f23: Layer already exists
9f9f651e9303: Layer already exists
62a747bf1719: Layer already exists
0b3c02b5d746: Layer already exists
aa6ce0101937: Pushed
20211020124336: digest: sha256:c99d7dfb89e708a2d4953010321e187e0e4e0a1065f5b207fcb4e0b004a8d204 size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 20, 2021 12:45:24 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 20, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 20, 2021 12:45:26 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 20, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 20, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 20, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 20, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 20, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 5bb77f16ed7b654ea65c4a37191e7b3d0bae694f379fa1116a1cf2c6b79634b9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-W7d_Fu17ZU6mXEo3GR57PQuuaU83n6ERahzyxreWNLk.pb
Oct 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 20, 2021 12:45:30 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f]
Oct 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 20, 2021 12:45:30 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b]
Oct 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 20, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 20, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-20_05_45_30-8985529173814672987?project=apache-beam-testing
Oct 20, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-20_05_45_30-8985529173814672987
Oct 20, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-20_05_45_30-8985529173814672987
Oct 20, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-20T12:45:38.043Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-k46o. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 20, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:43.410Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 20, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:44.107Z: Expanding SplittableParDo operations into optimizable parts.
Oct 20, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:44.128Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 20, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:44.186Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 20, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:44.256Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 20, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:44.277Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 20, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:44.338Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 20, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:44.486Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 20, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:44.513Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:44.537Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:44.561Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:44.596Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:44.694Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:44.739Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:44.798Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:44.844Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:44.891Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:44.932Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:44.964Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:45.001Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:45.025Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:45.068Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:45.105Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:45.128Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:45.153Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:45.179Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:45.205Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:45.239Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:45.262Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:45.286Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 20, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:45:45.612Z: Starting 5 workers in us-central1-a...
Oct 20, 2021 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:46:07.329Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 20, 2021 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:46:30.487Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 20, 2021 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:47:23.889Z: Workers have started successfully.
Oct 20, 2021 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T12:47:23.927Z: Workers have started successfully.
Oct 20, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T16:00:27.510Z: Cancel request is committed for workflow job: 2021-10-20_05_45_30-8985529173814672987.
Oct 20, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T16:00:27.969Z: Cleaning up.
Oct 20, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T16:00:28.043Z: Stopping worker pool...
Oct 20, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T16:00:28.112Z: Stopping worker pool...
Oct 20, 2021 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-20T16:00:36.518Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/worker/streaming/streaming_rpc_windmill_service_server.cc:1089
Oct 20, 2021 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T16:02:51.502Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 20, 2021 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-20T16:02:51.536Z: Worker pool stopped.
Oct 20, 2021 4:02:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-20_05_45_30-8985529173814672987 finished with status CANCELLED.
Load test results for test (ID): 22762324-71b2-4662-9b2b-f910527e588a and timestamp: 2021-10-20T12:45:25.606000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11561.399
dataflow_v2_java11_total_bytes_count             2.57977194E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211020124336
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c99d7dfb89e708a2d4953010321e187e0e4e0a1065f5b207fcb4e0b004a8d204
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211020124336]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c99d7dfb89e708a2d4953010321e187e0e4e0a1065f5b207fcb4e0b004a8d204]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211020124336] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c99d7dfb89e708a2d4953010321e187e0e4e0a1065f5b207fcb4e0b004a8d204])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c99d7dfb89e708a2d4953010321e187e0e4e0a1065f5b207fcb4e0b004a8d204
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c99d7dfb89e708a2d4953010321e187e0e4e0a1065f5b207fcb4e0b004a8d204
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c99d7dfb89e708a2d4953010321e187e0e4e0a1065f5b207fcb4e0b004a8d204].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 42s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/i4w5smwlnqsey

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #124

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/124/display/redirect?page=changes>

Changes:

[zhoufek] [BEAM-9487] Multiple Trigger.may_lose_data fixes

[zhoufek] [BEAM-9487] Remove CONDITION_NOT_GUARANTEED as potential data loss

[zhoufek] [BEAM-9487] Do AfterAny, AfterAll, and AfterEach checks properly (i.e.

[zhoufek] [BEAM-9487] Remove unused import

[zhoufek] [BEAM-9487] Reintroduce flag but do not use it

[zhoufek] [BEAM-9487] Add test that shows AfterCount finishing

[zhoufek] [BEAM-9487] Make _ParallelTriggerFn.may_finish clearer

[Robert Bradshaw] Revert "Merge pull request #15441 from [BEAM-8823] Make FnApiRunner work

[Robert Bradshaw] [BEAM-13040] Add some test cases enforcing side input waiting.

[Robert Bradshaw] lint

[brachipa] [BEAM-12393] sql support for Zeta Sql

[aydar.zaynutdinov] [BEAM-12988] [Playground]

[brachipa] [BEAM-12393] package private

[brachipa] [BEAM-12393] returning more generic interface

[noreply] [BEAM-13066] Produce abstract iterables from IterableCoder. (#15662)

[noreply] [BEAM-11936] Fix some errorprone warnings (#15648)

[noreply] [BEAM-13068] Add xlangx.DecodeStructPayload (#15741)

[Luke Cwik] [BEAM-13015] Implement a simplified cancellable blocking queue with


------------------------------------------
[...truncated 48.75 KB...]
1d979abf3de2: Preparing
289798ca79de: Preparing
f750d5c064cb: Preparing
d19373d00b9e: Preparing
b93a3cfa0410: Preparing
c21938f48ffa: Preparing
211736d7fc9d: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
62a747bf1719: Preparing
289798ca79de: Waiting
9f9f651e9303: Waiting
0b3c02b5d746: Waiting
f750d5c064cb: Waiting
62a747bf1719: Waiting
ba6e5ff31f23: Waiting
d19373d00b9e: Waiting
c21938f48ffa: Waiting
b93a3cfa0410: Waiting
36e0782f1159: Waiting
62a5b8741e83: Waiting
237c7227b067: Waiting
1d979abf3de2: Waiting
211736d7fc9d: Waiting
e54700390d44: Pushed
15dda34baecf: Pushed
8c1506596a62: Pushed
10f2325e684e: Pushed
237c7227b067: Pushed
b97be556d78d: Pushed
289798ca79de: Pushed
f750d5c064cb: Pushed
211736d7fc9d: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
9f9f651e9303: Layer already exists
b93a3cfa0410: Pushed
0b3c02b5d746: Layer already exists
c21938f48ffa: Pushed
62a747bf1719: Layer already exists
1d979abf3de2: Pushed
d19373d00b9e: Pushed
20211019124332: digest: sha256:d91e16e0bd057c6131ee1f035da528600156546eee09db267f179a80e445ef1b size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 19, 2021 12:45:26 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 19, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 19, 2021 12:45:27 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 19, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 19, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 19, 2021 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 19, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 19, 2021 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 555cfa9d337e4b57346b563d5ebcc30af3d429d6d26afa106e3c27f11463c136> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-VVz6nTN-S1c0a1Y9XrzDCvPUKdbSavoQbjwn8RRjwTY.pb
Oct 19, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 19, 2021 12:45:31 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc]
Oct 19, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 19, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 19, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 19, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 19, 2021 12:45:31 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e]
Oct 19, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 19, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 19, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 19, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 19, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 19, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 19, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 19, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 19, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 19, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 19, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 19, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 19, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-19_05_45_31-9238490290909742448?project=apache-beam-testing
Oct 19, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-19_05_45_31-9238490290909742448
Oct 19, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-19_05_45_31-9238490290909742448
Oct 19, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-19T12:45:43.025Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-watl. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:47.853Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:48.708Z: Expanding SplittableParDo operations into optimizable parts.
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:48.742Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:48.798Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:48.869Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:48.898Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:48.952Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.055Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.095Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.160Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.194Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.226Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.258Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.294Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.327Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.359Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.393Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.425Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.458Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.495Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.516Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.549Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.572Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.604Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.636Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.671Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.705Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.744Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.776Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:49.809Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 19, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:45:50.218Z: Starting 5 workers in us-central1-a...
Oct 19, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:46:01.361Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 19, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:46:26.099Z: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 19, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:46:26.134Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
Oct 19, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:46:36.426Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 19, 2021 12:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:47:28.817Z: Workers have started successfully.
Oct 19, 2021 12:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T12:47:28.846Z: Workers have started successfully.
Oct 19, 2021 2:59:38 PM org.apache.beam.sdk.metrics.MetricsEnvironment getCurrentContainer
WARNING: Reporting metrics are not supported in the current execution environment.
Oct 19, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T16:00:29.077Z: Cancel request is committed for workflow job: 2021-10-19_05_45_31-9238490290909742448.
Oct 19, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T16:00:29.153Z: Cleaning up.
Oct 19, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T16:00:29.252Z: Stopping worker pool...
Oct 19, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T16:00:29.309Z: Stopping worker pool...
Oct 19, 2021 4:02:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T16:02:48.758Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 19, 2021 4:02:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-19T16:02:48.794Z: Worker pool stopped.
Oct 19, 2021 4:02:54 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-19_05_45_31-9238490290909742448 finished with status CANCELLED.
Load test results for test (ID): d5a994cd-f589-493b-8f96-f8f5246c719b and timestamp: 2021-10-19T12:45:27.262000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                  11532.16
dataflow_v2_java11_total_bytes_count             3.33109931E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
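The stack trace above shows the load-test harness converting a CANCELLED terminal state into a thrown exception, which is what fails the Gradle task. A minimal sketch of that behavior (hypothetical class and method names, not the actual Beam source):

```java
// Sketch only: why a CANCELLED Dataflow job surfaces as a build failure.
// The harness treats any terminal state other than DONE as an error.
public class TerminalStateCheck {
    // Mirrors the terminal states a Dataflow job can report.
    enum JobState { DONE, FAILED, CANCELLED, UPDATED }

    static void handleFailure(JobState state) {
        if (state != JobState.DONE) {
            // Same message shape as the log line above.
            throw new RuntimeException("Invalid job state: " + state + ".");
        }
    }

    public static void main(String[] args) {
        handleFailure(JobState.DONE); // passes silently
        try {
            handleFailure(JobState.CANCELLED);
            throw new AssertionError("expected failure for CANCELLED");
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // Invalid job state: CANCELLED.
        }
    }
}
```

Here the job was cancelled externally (likely by the load test's own deadline), so the run is reported as failed even though the metrics above were collected.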

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211019124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d91e16e0bd057c6131ee1f035da528600156546eee09db267f179a80e445ef1b
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211019124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d91e16e0bd057c6131ee1f035da528600156546eee09db267f179a80e445ef1b]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211019124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d91e16e0bd057c6131ee1f035da528600156546eee09db267f179a80e445ef1b])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d91e16e0bd057c6131ee1f035da528600156546eee09db267f179a80e445ef1b
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d91e16e0bd057c6131ee1f035da528600156546eee09db267f179a80e445ef1b
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d91e16e0bd057c6131ee1f035da528600156546eee09db267f179a80e445ef1b].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 40s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/kmcjt2z7qm2fm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #123

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/123/display/redirect>

Changes:


------------------------------------------
[...truncated 56.61 KB...]
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 18, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 18, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 18, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 18, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 18, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 18, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 18, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 18, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 18, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-18_05_45_31-14203049816433255547?project=apache-beam-testing
Oct 18, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-18_05_45_31-14203049816433255547
Oct 18, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-18_05_45_31-14203049816433255547
Oct 18, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-18T12:45:38.957Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-sqas. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
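The warning above means the job name does not satisfy the Cloud Label character restrictions, so Dataflow rewrites it before labeling GCE resources. One plausible client-side pre-check, assuming a label must match lowercase letters, digits, and hyphens, start with a letter, and be at most 63 characters (this is not Dataflow's actual internal sanitizer):

```java
import java.util.regex.Pattern;

// Hypothetical sketch of validating/sanitizing a job name against the
// Cloud Label restrictions referenced in the warning above.
public class LabelCheck {
    private static final Pattern VALID =
        Pattern.compile("[a-z]([-a-z0-9]{0,61}[a-z0-9])?");

    static boolean isValidLabel(String name) {
        return VALID.matcher(name).matches();
    }

    static String sanitize(String name) {
        // Lowercase, then map every disallowed character to '0'.
        String s = name.toLowerCase().replaceAll("[^-a-z0-9]", "0");
        if (!s.isEmpty() && !Character.isLetter(s.charAt(0))) {
            s = "j" + s; // hypothetical prefix to satisfy the leading-letter rule
        }
        return s.length() > 63 ? s.substring(0, 63) : s;
    }

    public static void main(String[] args) {
        System.out.println(isValidLabel("load_tests_Java11"));           // false
        System.out.println(isValidLabel(sanitize("load_tests_Java11"))); // true
    }
}
```

This matches the pattern visible in the modified name the service reports, where underscores and other invalid characters appear replaced by '0'.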
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:43.143Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:43.857Z: Expanding SplittableParDo operations into optimizable parts.
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:43.882Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:43.949Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.014Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.039Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.095Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.187Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.208Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.235Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.260Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.292Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.324Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.363Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.390Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.416Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.431Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.501Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.530Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.565Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.597Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.624Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.656Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.689Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.714Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.741Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.785Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.814Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.851Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 18, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:44.881Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 18, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:45:45.244Z: Starting 5 workers in us-central1-a...
Oct 18, 2021 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:46:14.278Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:46:36.203Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 18, 2021 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:47:37.482Z: Workers have started successfully.
Oct 18, 2021 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T12:47:37.509Z: Workers have started successfully.
Oct 18, 2021 3:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:45:46.425Z: Staged package checker-qual-3.10.0-pNyILKaqxJbTM4Hl5esGBMRUg7AEvD6sk2jxuxfLJRI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/checker-qual-3.10.0-pNyILKaqxJbTM4Hl5esGBMRUg7AEvD6sk2jxuxfLJRI.jar' is inaccessible.
Oct 18, 2021 3:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:45:47.367Z: Staged package google-http-client-1.39.2-arSWeGCSfwUd1_K1deS3qO8182fcLjQIBzqk3uMo4ic.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-1.39.2-arSWeGCSfwUd1_K1deS3qO8182fcLjQIBzqk3uMo4ic.jar' is inaccessible.
Oct 18, 2021 3:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:45:47.409Z: Staged package google-http-client-apache-v2-1.39.2-UtOSCfZav29XfhyrRmrbmfEtHNK6TpjYblAfddf4UM8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-apache-v2-1.39.2-UtOSCfZav29XfhyrRmrbmfEtHNK6TpjYblAfddf4UM8.jar' is inaccessible.
Oct 18, 2021 3:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:45:47.454Z: Staged package google-http-client-appengine-1.39.2-9jINwNsDWv9uqy4odIVg2nGKtGX7w0gbZT6j8nqlysk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-appengine-1.39.2-9jINwNsDWv9uqy4odIVg2nGKtGX7w0gbZT6j8nqlysk.jar' is inaccessible.
Oct 18, 2021 3:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:45:47.505Z: Staged package google-http-client-gson-1.39.2-z4GUT9Qf4NqoabkTvQqq7w5Nj-Z3ShlVQPRU7NPPULY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-gson-1.39.2-z4GUT9Qf4NqoabkTvQqq7w5Nj-Z3ShlVQPRU7NPPULY.jar' is inaccessible.
Oct 18, 2021 3:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:45:47.580Z: Staged package google-http-client-jackson2-1.39.2-dKSTp7ol4d-z06Td9xMwbVo3htsAo0idWQYdyM_zIbE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-jackson2-1.39.2-dKSTp7ol4d-z06Td9xMwbVo3htsAo0idWQYdyM_zIbE.jar' is inaccessible.
Oct 18, 2021 3:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:45:47.644Z: Staged package google-http-client-protobuf-1.39.2-48CRMH_IJ9gY8fZ3-5ANC0qF7rqbijaXD90xuQTMr5s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-protobuf-1.39.2-48CRMH_IJ9gY8fZ3-5ANC0qF7rqbijaXD90xuQTMr5s.jar' is inaccessible.
Oct 18, 2021 3:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:45:48.974Z: Staged package opencensus-contrib-grpc-util-0.28.0-uRaDRuavZZMwChvCfvdCVKofJAGYhZON2PuFK4d9VfA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/opencensus-contrib-grpc-util-0.28.0-uRaDRuavZZMwChvCfvdCVKofJAGYhZON2PuFK4d9VfA.jar' is inaccessible.
Oct 18, 2021 3:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:45:49.550Z: Staged package zstd-jni-1.4.5-2-lzQxwUtNCahuI7cYQRb8rC2FUB60p0MPfRhczhr0YFA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/zstd-jni-1.4.5-2-lzQxwUtNCahuI7cYQRb8rC2FUB60p0MPfRhczhr0YFA.jar' is inaccessible.
Oct 18, 2021 3:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-18T15:45:49.590Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 18, 2021 3:48:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-18T15:48:48.937Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 18, 2021 3:51:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:51:46.458Z: Staged package checker-qual-3.10.0-pNyILKaqxJbTM4Hl5esGBMRUg7AEvD6sk2jxuxfLJRI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/checker-qual-3.10.0-pNyILKaqxJbTM4Hl5esGBMRUg7AEvD6sk2jxuxfLJRI.jar' is inaccessible.
Oct 18, 2021 3:51:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:51:47.619Z: Staged package google-http-client-1.39.2-arSWeGCSfwUd1_K1deS3qO8182fcLjQIBzqk3uMo4ic.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-1.39.2-arSWeGCSfwUd1_K1deS3qO8182fcLjQIBzqk3uMo4ic.jar' is inaccessible.
Oct 18, 2021 3:51:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:51:47.659Z: Staged package google-http-client-apache-v2-1.39.2-UtOSCfZav29XfhyrRmrbmfEtHNK6TpjYblAfddf4UM8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-apache-v2-1.39.2-UtOSCfZav29XfhyrRmrbmfEtHNK6TpjYblAfddf4UM8.jar' is inaccessible.
Oct 18, 2021 3:51:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:51:47.694Z: Staged package google-http-client-appengine-1.39.2-9jINwNsDWv9uqy4odIVg2nGKtGX7w0gbZT6j8nqlysk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-appengine-1.39.2-9jINwNsDWv9uqy4odIVg2nGKtGX7w0gbZT6j8nqlysk.jar' is inaccessible.
Oct 18, 2021 3:51:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:51:47.745Z: Staged package google-http-client-gson-1.39.2-z4GUT9Qf4NqoabkTvQqq7w5Nj-Z3ShlVQPRU7NPPULY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-gson-1.39.2-z4GUT9Qf4NqoabkTvQqq7w5Nj-Z3ShlVQPRU7NPPULY.jar' is inaccessible.
Oct 18, 2021 3:51:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:51:47.792Z: Staged package google-http-client-jackson2-1.39.2-dKSTp7ol4d-z06Td9xMwbVo3htsAo0idWQYdyM_zIbE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-jackson2-1.39.2-dKSTp7ol4d-z06Td9xMwbVo3htsAo0idWQYdyM_zIbE.jar' is inaccessible.
Oct 18, 2021 3:51:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:51:47.841Z: Staged package google-http-client-protobuf-1.39.2-48CRMH_IJ9gY8fZ3-5ANC0qF7rqbijaXD90xuQTMr5s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-protobuf-1.39.2-48CRMH_IJ9gY8fZ3-5ANC0qF7rqbijaXD90xuQTMr5s.jar' is inaccessible.
Oct 18, 2021 3:51:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:51:51.684Z: Staged package opencensus-contrib-grpc-util-0.28.0-uRaDRuavZZMwChvCfvdCVKofJAGYhZON2PuFK4d9VfA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/opencensus-contrib-grpc-util-0.28.0-uRaDRuavZZMwChvCfvdCVKofJAGYhZON2PuFK4d9VfA.jar' is inaccessible.
Oct 18, 2021 3:51:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:51:52.173Z: Staged package zstd-jni-1.4.5-2-lzQxwUtNCahuI7cYQRb8rC2FUB60p0MPfRhczhr0YFA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/zstd-jni-1.4.5-2-lzQxwUtNCahuI7cYQRb8rC2FUB60p0MPfRhczhr0YFA.jar' is inaccessible.
Oct 18, 2021 3:51:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-18T15:51:52.196Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 18, 2021 3:54:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-18T15:54:49.008Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 18, 2021 3:57:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:57:46.185Z: Staged package checker-qual-3.10.0-pNyILKaqxJbTM4Hl5esGBMRUg7AEvD6sk2jxuxfLJRI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/checker-qual-3.10.0-pNyILKaqxJbTM4Hl5esGBMRUg7AEvD6sk2jxuxfLJRI.jar' is inaccessible.
Oct 18, 2021 3:57:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:57:46.940Z: Staged package google-http-client-1.39.2-arSWeGCSfwUd1_K1deS3qO8182fcLjQIBzqk3uMo4ic.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-1.39.2-arSWeGCSfwUd1_K1deS3qO8182fcLjQIBzqk3uMo4ic.jar' is inaccessible.
Oct 18, 2021 3:57:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:57:46.987Z: Staged package google-http-client-apache-v2-1.39.2-UtOSCfZav29XfhyrRmrbmfEtHNK6TpjYblAfddf4UM8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-apache-v2-1.39.2-UtOSCfZav29XfhyrRmrbmfEtHNK6TpjYblAfddf4UM8.jar' is inaccessible.
Oct 18, 2021 3:57:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:57:47.029Z: Staged package google-http-client-appengine-1.39.2-9jINwNsDWv9uqy4odIVg2nGKtGX7w0gbZT6j8nqlysk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-appengine-1.39.2-9jINwNsDWv9uqy4odIVg2nGKtGX7w0gbZT6j8nqlysk.jar' is inaccessible.
Oct 18, 2021 3:57:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:57:47.066Z: Staged package google-http-client-gson-1.39.2-z4GUT9Qf4NqoabkTvQqq7w5Nj-Z3ShlVQPRU7NPPULY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-gson-1.39.2-z4GUT9Qf4NqoabkTvQqq7w5Nj-Z3ShlVQPRU7NPPULY.jar' is inaccessible.
Oct 18, 2021 3:57:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:57:47.101Z: Staged package google-http-client-jackson2-1.39.2-dKSTp7ol4d-z06Td9xMwbVo3htsAo0idWQYdyM_zIbE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-jackson2-1.39.2-dKSTp7ol4d-z06Td9xMwbVo3htsAo0idWQYdyM_zIbE.jar' is inaccessible.
Oct 18, 2021 3:57:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:57:47.135Z: Staged package google-http-client-protobuf-1.39.2-48CRMH_IJ9gY8fZ3-5ANC0qF7rqbijaXD90xuQTMr5s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-protobuf-1.39.2-48CRMH_IJ9gY8fZ3-5ANC0qF7rqbijaXD90xuQTMr5s.jar' is inaccessible.
Oct 18, 2021 3:57:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:57:48.240Z: Staged package opencensus-contrib-grpc-util-0.28.0-uRaDRuavZZMwChvCfvdCVKofJAGYhZON2PuFK4d9VfA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/opencensus-contrib-grpc-util-0.28.0-uRaDRuavZZMwChvCfvdCVKofJAGYhZON2PuFK4d9VfA.jar' is inaccessible.
Oct 18, 2021 3:57:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-18T15:57:48.720Z: Staged package zstd-jni-1.4.5-2-lzQxwUtNCahuI7cYQRb8rC2FUB60p0MPfRhczhr0YFA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/zstd-jni-1.4.5-2-lzQxwUtNCahuI7cYQRb8rC2FUB60p0MPfRhczhr0YFA.jar' is inaccessible.
Oct 18, 2021 3:57:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-18T15:57:48.744Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
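The repeated "Staged package ... is inaccessible" errors all point at jars under the same staging bucket, which suggests a GCS permission or object-lifecycle problem rather than a per-package issue. A small triage helper (a sketch, not part of Beam) can pull the gs:// locations out of these messages so each one can be checked by hand:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Extracts the gs:// paths from "Staged package ... is inaccessible"
// log lines, for manual follow-up (e.g. checking each object's ACLs).
public class StagedPackageTriage {
    private static final Pattern INACCESSIBLE = Pattern.compile(
        "Staged package \\S+ at location '(gs://[^']+)' is inaccessible");

    static List<String> inaccessiblePaths(List<String> logLines) {
        List<String> paths = new ArrayList<>();
        for (String line : logLines) {
            Matcher m = INACCESSIBLE.matcher(line);
            if (m.find()) {
                paths.add(m.group(1));
            }
        }
        return paths;
    }

    public static void main(String[] args) {
        List<String> sample = List.of(
            "SEVERE: 2021-10-18T15:45:46.425Z: Staged package checker-qual.jar "
                + "at location 'gs://temp-storage-for-perf-tests/loadtests/staging/checker-qual.jar' "
                + "is inaccessible.");
        System.out.println(inaccessiblePaths(sample));
    }
}
```

Note the job itself kept running; these access checks affect monitoring of the temp location, and the eventual failure below comes from the cancel request, not from these errors directly.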
Oct 18, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T16:00:25.846Z: Cancel request is committed for workflow job: 2021-10-18_05_45_31-14203049816433255547.
Oct 18, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T16:00:25.920Z: Cleaning up.
Oct 18, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T16:00:26.018Z: Stopping worker pool...
Oct 18, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T16:00:26.077Z: Stopping worker pool...
Oct 18, 2021 4:02:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T16:02:50.941Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 18, 2021 4:02:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-18T16:02:50.979Z: Worker pool stopped.
Oct 18, 2021 4:02:56 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-18_05_45_31-14203049816433255547 finished with status CANCELLED.
Load test results for test (ID): 43d5a22c-7d4e-467d-b8ba-360457bd8fb0 and timestamp: 2021-10-18T12:45:26.825000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11541.005
dataflow_v2_java11_total_bytes_count             2.66347914E10
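The two metrics above can be combined into an average throughput figure, which is useful when comparing runs. A back-of-envelope sketch (the division is mine, not emitted by the test):

```java
// Convert the reported runtime and byte count into average throughput.
public class Throughput {
    public static void main(String[] args) {
        double runtimeSec = 11541.005;      // dataflow_v2_java11_runtime_sec
        double totalBytes = 2.66347914e10;  // dataflow_v2_java11_total_bytes_count
        double mbPerSec = totalBytes / runtimeSec / 1e6;
        System.out.printf("~%.1f MB/s average over the run%n", mbPerSec);
    }
}
```

That works out to roughly 2.3 MB/s sustained over the ~3.2-hour streaming run before cancellation.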
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211018124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4df8ffb0b0f0f953f605731eb08a71de653af8fe1b3e9b93e4f42638fc4a24ba
Deleted: sha256:a9c8e6a10e40c6458c674ad3df0f71f874203fde2d4f097f76e4ca126c5f5bd7
Deleted: sha256:3dbaaf904543b89a81c99997561f9fc1e1ca04d0465c13a9cbeffb7117a418ed
Deleted: sha256:9be7aff646e3ecac7ffe5b5729cb2f10468ec343294a5e6f89a18c0f84538b56
Deleted: sha256:17dc3c4c167c47ec274269d5a7e338afa813d3a86c1cb1ee730a3d4c685dde9c
Deleted: sha256:7d7172892a76fa790a898b2b8225479efe3d59f0038cfe69b08af3b3f9e1625b
Deleted: sha256:6e94d0611cec08dbe29f947ec16346fff79538800eb712873a0a20b7b7b65283
Deleted: sha256:12a0a42cefa779ad517829f79c3b844598a7378b8c32d699f6eafc70b17f8d62
Deleted: sha256:cd9835914c02a5d47c5c13a47ad0299c215d3b70a986e0ac9f2a7bcc0d05f388
Deleted: sha256:b969294e2397f76eab4d2a31e5c95404627e4133f65a5c39d4cd120471f9bff8
Deleted: sha256:35bc0b3e42753c7df41a4fe00330a9c0c74ef569077002af11b98ca45de41a98
Deleted: sha256:d888356de3ffc6ab1621e26e546f78e96a903e24a948443834a2578d808312f7
Deleted: sha256:1e5f414dd9725a4a7aca61ee9b38e356995bc8d62454f3aacd414deb4c2a9659
Deleted: sha256:0b91fe418ba6384286cab87ebe78460c1adc96cf39684041f0ba53c27d2eef2a
Deleted: sha256:ed6dc7e7255bc6a359c7cc86243dcd53f8a67c29168767b9be9fedc5f29ebfff
Deleted: sha256:8852fc44a380cf00ad8b3aff3e65f40b368191cdabf6a026e6efdc79cbd30051
Deleted: sha256:9ae3ec8d051530b90a6ea38063bcef8e42b01742c5236195c7c2d71a839db82c
Deleted: sha256:2e0b196ed93369c3f6c1713b45347ac214fa5d848e9e2b04245ec7a980884c9f
Deleted: sha256:d9127b1356b05acc490eddd987582d0db8b2ac018747b3dd0dc4a09c28c56048
Deleted: sha256:9884a28cd1a12d5a25c815afc824da9c1462284ceee324befebaaa04ec1ebbce
Deleted: sha256:581a651601c9058540e65efc34628708043a174c672613ec6e9f701f9aba5769
Deleted: sha256:56abf364cb5b331fe81c2e0ed53998a978e4b867ac646c6884af15cc6039c914
Deleted: sha256:3a9c8ca970db3476d1318117e075652be3dc54a17ff4c6dcbea91b69ba6a809d
Deleted: sha256:af1d7e3142536074b5e92da4f36546c65c316d645f471bb64bcf2023b56d6351
Deleted: sha256:394935694891be25b97736cfa134e8122d07066e7158e6d987430f3680a444b4
Deleted: sha256:26496153c8d40746d3ad5ff311ca6020ea4db804a87a59b2f94958136c12f1be
Deleted: sha256:f008f4d0af97cd55a2f3e41031a1a25c8b32795803aefeef41bd7aa20ab38867
Deleted: sha256:6f929a762cf92dcaaa08e1a66e1f937f1eff277f7bea37497364820d8e613c1a
Deleted: sha256:b114444ee7535e21dec502a8781fa7fb1b96ad41d139c4aa142398911e2f12b1
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211018124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4df8ffb0b0f0f953f605731eb08a71de653af8fe1b3e9b93e4f42638fc4a24ba]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211018124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4df8ffb0b0f0f953f605731eb08a71de653af8fe1b3e9b93e4f42638fc4a24ba])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4df8ffb0b0f0f953f605731eb08a71de653af8fe1b3e9b93e4f42638fc4a24ba
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4df8ffb0b0f0f953f605731eb08a71de653af8fe1b3e9b93e4f42638fc4a24ba
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4df8ffb0b0f0f953f605731eb08a71de653af8fe1b3e9b93e4f42638fc4a24ba].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 40s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/p7s2jbh462rrk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #122

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/122/display/redirect>

Changes:


------------------------------------------
[...truncated 48.52 KB...]
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
0d03cb6de735: Preparing
373cfc536b92: Preparing
d7d59b119c13: Preparing
08731deb5899: Preparing
6f863fa754a0: Preparing
735ad4aee5b9: Preparing
57c28a12ac5a: Preparing
66b571650cb9: Preparing
ef13a8f3645d: Preparing
42f9c59d4994: Preparing
be05c80060cb: Preparing
e63be041c4ce: Preparing
211736d7fc9d: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
735ad4aee5b9: Waiting
0b3c02b5d746: Preparing
66b571650cb9: Waiting
62a747bf1719: Preparing
57c28a12ac5a: Waiting
42f9c59d4994: Waiting
ef13a8f3645d: Waiting
be05c80060cb: Waiting
e63be041c4ce: Waiting
211736d7fc9d: Waiting
ba6e5ff31f23: Waiting
36e0782f1159: Waiting
0b3c02b5d746: Waiting
62a747bf1719: Waiting
62a5b8741e83: Waiting
d7d59b119c13: Pushed
6f863fa754a0: Pushed
373cfc536b92: Pushed
735ad4aee5b9: Pushed
0d03cb6de735: Pushed
08731deb5899: Pushed
66b571650cb9: Pushed
ef13a8f3645d: Pushed
211736d7fc9d: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
be05c80060cb: Pushed
ba6e5ff31f23: Layer already exists
57c28a12ac5a: Pushed
9f9f651e9303: Layer already exists
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
e63be041c4ce: Pushed
42f9c59d4994: Pushed
20211017124332: digest: sha256:2179fd19be59fd1b3625061f91acd7781b377b4b53fa39b9c539dfee16c5ecb9 size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 17, 2021 12:45:21 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 17, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 17, 2021 12:45:22 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 17, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 17, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 17, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 17, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 17, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 5e28b9a098b764a4bcbb22a3f47c2d450db954c77953aa6fead07d9d8106cb9c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Xii5oJi3ZKS8uyKj9HwtRQ25VMd5U6pv6tB9nYEGy5w.pb
Oct 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 17, 2021 12:45:27 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@aaa0f76]
Oct 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 17, 2021 12:45:27 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34b9eb03]
Oct 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 17, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-17_05_45_28-12751250368988367149?project=apache-beam-testing
Oct 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-17_05_45_28-12751250368988367149
Oct 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-17_05_45_28-12751250368988367149
Oct 17, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-17T12:45:33.331Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-1npm. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:36.516Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:37.321Z: Expanding SplittableParDo operations into optimizable parts.
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:37.347Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:37.402Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:37.476Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:37.505Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:37.559Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:37.652Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:37.687Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:37.717Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:37.751Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:37.782Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:37.799Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:37.830Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:37.866Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:37.899Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:37.926Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:37.956Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:37.981Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:38.003Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:38.034Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:38.056Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:38.089Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:38.120Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:38.155Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:38.189Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:38.217Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:38.244Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:38.278Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 17, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:38.321Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 17, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:38.691Z: Starting 5 workers in us-central1-a...
Oct 17, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:45:54.570Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 17, 2021 12:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:46:22.041Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 17, 2021 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:47:16.455Z: Workers have started successfully.
Oct 17, 2021 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T12:47:16.484Z: Workers have started successfully.
Oct 17, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T16:00:26.861Z: Cancel request is committed for workflow job: 2021-10-17_05_45_28-12751250368988367149.
Oct 17, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T16:00:26.939Z: Cleaning up.
Oct 17, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T16:00:27.104Z: Stopping worker pool...
Oct 17, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T16:00:27.163Z: Stopping worker pool...
Oct 17, 2021 4:03:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T16:03:06.035Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 17, 2021 4:03:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-17T16:03:06.072Z: Worker pool stopped.
Oct 17, 2021 4:03:12 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-17_05_45_28-12751250368988367149 finished with status CANCELLED.
Load test results for test (ID): 31c80f47-4469-4607-afe5-9082afe97d61 and timestamp: 2021-10-17T12:45:22.254000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11539.799
dataflow_v2_java11_total_bytes_count             2.36353573E10
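Dividing total bytes by runtime for this run, and for the 2021-10-18 run reported earlier in this log, shows a modest run-to-run variance in average throughput (an editorial sketch; figures copied from the two result blocks):

```python
# (total_bytes, runtime_sec) pairs copied from the two result blocks in this log.
runs = {
    "2021-10-18": (2.66347914e10, 11541.005),
    "2021-10-17": (2.36353573e10, 11539.799),
}
for day, (total_bytes, runtime_sec) in runs.items():
    # Average throughput over the measured window (1 MB = 1e6 bytes).
    print(day, f"{total_bytes / runtime_sec / 1e6:.2f} MB/s")
# -> 2021-10-18 2.31 MB/s
# -> 2021-10-17 2.05 MB/s
```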
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211017124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2179fd19be59fd1b3625061f91acd7781b377b4b53fa39b9c539dfee16c5ecb9
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211017124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2179fd19be59fd1b3625061f91acd7781b377b4b53fa39b9c539dfee16c5ecb9]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211017124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2179fd19be59fd1b3625061f91acd7781b377b4b53fa39b9c539dfee16c5ecb9])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2179fd19be59fd1b3625061f91acd7781b377b4b53fa39b9c539dfee16c5ecb9
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2179fd19be59fd1b3625061f91acd7781b377b4b53fa39b9c539dfee16c5ecb9
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2179fd19be59fd1b3625061f91acd7781b377b4b53fa39b9c539dfee16c5ecb9].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 57s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/sd4pab4htlagc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #121

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/121/display/redirect?page=changes>

Changes:

[noreply] Corrected Join Example

[noreply] Minor: Add more links to DataFrame API documentation (#15661)

[noreply] [BEAM-11480] Use snippets for DataFrame examples (#15600)

[noreply] Allow multiple Python worker processe to share the same VM. (#15642)

[noreply] [BEAM-12564] Implement Series.hasnans (#15729)

[noreply] Minor: Fix frames_test.py equality check for non-frame outputs (#15734)

[noreply] [BEAM-12769] Adds integration tests for Java Class Lookup based

[Brian Hulette] Address BEAM-4028 cleanup TODOs

[Brian Hulette] Bump Dataflow container to beam-master-20211015

[noreply] [BEAM-13052] Restructure pubsublite folder to move non-user interface


------------------------------------------
[...truncated 58.09 KB...]
INFO: Adding Ungroup and reiterate as step s14
Oct 16, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 16, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 16, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-16_05_45_49-17323798727383694038?project=apache-beam-testing
Oct 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-16_05_45_49-17323798727383694038
Oct 16, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-16_05_45_49-17323798727383694038
Oct 16, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-16T12:45:54.539Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-pft5. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 16, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:45:59.700Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:00.574Z: Expanding SplittableParDo operations into optimizable parts.
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:00.622Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:00.694Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:00.765Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:00.793Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:00.851Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:00.952Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:00.987Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.019Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.052Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.086Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.110Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.143Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.174Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.209Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.244Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.275Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.318Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.351Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.391Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.427Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.464Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.492Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.518Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.546Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.602Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.629Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.673Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:01.714Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
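The "Fusing consumer X into producer Y" lines above narrate the Dataflow optimizer collapsing adjacent steps into single stages so elements pass in-process instead of being materialized between them. A conceptual illustration of that idea (plain function composition, not the optimizer's code):

```java
import java.util.function.Function;

public class FusionSketch {
    // Conceptual only: "fusing consumer B into producer A" means the two
    // steps execute as one, with each element flowing straight from A to B
    // rather than through an intermediate materialized collection.
    public static <A, B, C> Function<A, C> fuse(Function<A, B> producer,
                                                Function<B, C> consumer) {
        return producer.andThen(consumer);
    }

    public static void main(String[] args) {
        Function<Integer, Integer> timeMonitor = x -> x;      // pass-through step
        Function<Integer, Integer> byteMonitor = x -> x * 2;  // toy transform
        System.out.println(fuse(timeMonitor, byteMonitor).apply(21)); // 42
    }
}
```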
Oct 16, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:02.125Z: Starting 5 workers in us-central1-a...
Oct 16, 2021 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:27.138Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 16, 2021 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:35.807Z: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 16, 2021 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:35.837Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
Oct 16, 2021 12:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:46:46.060Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 16, 2021 12:47:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:47:40.119Z: Workers have started successfully.
Oct 16, 2021 12:47:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T12:47:40.153Z: Workers have started successfully.
Oct 16, 2021 1:19:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:19:03.749Z: Staged package failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar' is inaccessible.
Oct 16, 2021 1:19:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:19:04.603Z: Staged package google-oauth-client-1.31.0-9fiaR9DCEOJ3VPxypLVvfxhohPwMB0pyoLY-Gc52QtA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-oauth-client-1.31.0-9fiaR9DCEOJ3VPxypLVvfxhohPwMB0pyoLY-Gc52QtA.jar' is inaccessible.
Oct 16, 2021 1:19:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:19:05.723Z: Staged package listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar' is inaccessible.
Oct 16, 2021 1:19:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:19:06.050Z: Staged package netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar' is inaccessible.
Oct 16, 2021 1:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:19:06.698Z: Staged package threetenbp-1.5.1-Q0LuBNhwQPcbCqkYjulgeA7y2nNOMqjUOlIqWAteDzs.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/threetenbp-1.5.1-Q0LuBNhwQPcbCqkYjulgeA7y2nNOMqjUOlIqWAteDzs.jar' is inaccessible.
Oct 16, 2021 1:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-16T13:19:06.811Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
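The SEVERE lines report each staged package that failed an access check individually, and the WARNING then aggregates them. A minimal, self-contained sketch of that report-then-summarize pattern (the names and structure here are illustrative, not the Dataflow service's code):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class StagingAccessCheck {
    // Illustrative only: given the packages a job expects at its staging
    // location and the set that is actually readable, collect the missing
    // ones so each can be logged individually before a summary warning.
    public static List<String> findInaccessible(List<String> staged, Set<String> readable) {
        List<String> missing = new ArrayList<>();
        for (String pkg : staged) {
            if (!readable.contains(pkg)) {
                missing.add(pkg);
            }
        }
        return missing;
    }

    public static void main(String[] args) {
        List<String> staged = Arrays.asList("failureaccess-1.0.1.jar", "threetenbp-1.5.1.jar");
        Set<String> readable = new HashSet<>(Arrays.asList("threetenbp-1.5.1.jar"));
        List<String> missing = findInaccessible(staged, readable);
        for (String pkg : missing) {
            System.out.println("SEVERE: Staged package " + pkg + " is inaccessible.");
        }
        if (!missing.isEmpty()) {
            System.out.println("WARNING: One or more access checks for staged files failed.");
        }
    }
}
```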
Oct 16, 2021 1:22:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-16T13:22:06.787Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 16, 2021 1:25:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:25:03.744Z: Staged package failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar' is inaccessible.
Oct 16, 2021 1:25:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:25:04.634Z: Staged package google-oauth-client-1.31.0-9fiaR9DCEOJ3VPxypLVvfxhohPwMB0pyoLY-Gc52QtA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-oauth-client-1.31.0-9fiaR9DCEOJ3VPxypLVvfxhohPwMB0pyoLY-Gc52QtA.jar' is inaccessible.
Oct 16, 2021 1:25:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:25:05.897Z: Staged package listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar' is inaccessible.
Oct 16, 2021 1:25:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:25:06.215Z: Staged package netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar' is inaccessible.
Oct 16, 2021 1:25:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:25:06.753Z: Staged package threetenbp-1.5.1-Q0LuBNhwQPcbCqkYjulgeA7y2nNOMqjUOlIqWAteDzs.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/threetenbp-1.5.1-Q0LuBNhwQPcbCqkYjulgeA7y2nNOMqjUOlIqWAteDzs.jar' is inaccessible.
Oct 16, 2021 1:25:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-16T13:25:06.889Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 16, 2021 1:28:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-16T13:28:06.865Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 16, 2021 1:31:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:31:03.590Z: Staged package failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar' is inaccessible.
Oct 16, 2021 1:31:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:31:04.668Z: Staged package google-oauth-client-1.31.0-9fiaR9DCEOJ3VPxypLVvfxhohPwMB0pyoLY-Gc52QtA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-oauth-client-1.31.0-9fiaR9DCEOJ3VPxypLVvfxhohPwMB0pyoLY-Gc52QtA.jar' is inaccessible.
Oct 16, 2021 1:31:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:31:05.756Z: Staged package listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar' is inaccessible.
Oct 16, 2021 1:31:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:31:06.063Z: Staged package netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar' is inaccessible.
Oct 16, 2021 1:31:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:31:06.693Z: Staged package threetenbp-1.5.1-Q0LuBNhwQPcbCqkYjulgeA7y2nNOMqjUOlIqWAteDzs.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/threetenbp-1.5.1-Q0LuBNhwQPcbCqkYjulgeA7y2nNOMqjUOlIqWAteDzs.jar' is inaccessible.
Oct 16, 2021 1:31:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-16T13:31:06.797Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 16, 2021 1:34:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-16T13:34:07.023Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 16, 2021 1:37:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:37:03.471Z: Staged package failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar' is inaccessible.
Oct 16, 2021 1:37:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:37:04.262Z: Staged package google-oauth-client-1.31.0-9fiaR9DCEOJ3VPxypLVvfxhohPwMB0pyoLY-Gc52QtA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-oauth-client-1.31.0-9fiaR9DCEOJ3VPxypLVvfxhohPwMB0pyoLY-Gc52QtA.jar' is inaccessible.
Oct 16, 2021 1:37:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:37:05.367Z: Staged package listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar' is inaccessible.
Oct 16, 2021 1:37:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:37:05.671Z: Staged package netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar' is inaccessible.
Oct 16, 2021 1:37:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:37:06.252Z: Staged package threetenbp-1.5.1-Q0LuBNhwQPcbCqkYjulgeA7y2nNOMqjUOlIqWAteDzs.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/threetenbp-1.5.1-Q0LuBNhwQPcbCqkYjulgeA7y2nNOMqjUOlIqWAteDzs.jar' is inaccessible.
Oct 16, 2021 1:37:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-16T13:37:06.366Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 16, 2021 1:40:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-16T13:40:06.274Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 16, 2021 1:43:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:43:03.533Z: Staged package failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/failureaccess-1.0.1-oXHuTHNN0tqDfksWvp30Zhr6typBra8x64Tf2vk2yiY.jar' is inaccessible.
Oct 16, 2021 1:43:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:43:04.274Z: Staged package google-oauth-client-1.31.0-9fiaR9DCEOJ3VPxypLVvfxhohPwMB0pyoLY-Gc52QtA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-oauth-client-1.31.0-9fiaR9DCEOJ3VPxypLVvfxhohPwMB0pyoLY-Gc52QtA.jar' is inaccessible.
Oct 16, 2021 1:43:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:43:05.544Z: Staged package listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava-s3KgN9QjCqV_vv_e8w_WEj-cDC24XQrO0AyRuXTzP5k.jar' is inaccessible.
Oct 16, 2021 1:43:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:43:05.963Z: Staged package netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-tcnative-boringssl-static-2.0.33.Final-q3CryMTeke7zEAdYTla2hvWe2cxEwrM38_6C8EAtOH0.jar' is inaccessible.
Oct 16, 2021 1:43:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-16T13:43:06.816Z: Staged package threetenbp-1.5.1-Q0LuBNhwQPcbCqkYjulgeA7y2nNOMqjUOlIqWAteDzs.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/threetenbp-1.5.1-Q0LuBNhwQPcbCqkYjulgeA7y2nNOMqjUOlIqWAteDzs.jar' is inaccessible.
Oct 16, 2021 1:43:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-16T13:43:06.953Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 16, 2021 1:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-16T13:46:06.252Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 16, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T16:00:27.167Z: Cancel request is committed for workflow job: 2021-10-16_05_45_49-17323798727383694038.
Oct 16, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T16:00:27.227Z: Cleaning up.
Oct 16, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T16:00:27.374Z: Stopping worker pool...
Oct 16, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T16:00:27.445Z: Stopping worker pool...
Oct 16, 2021 4:02:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T16:02:49.591Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 16, 2021 4:02:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-16T16:02:49.638Z: Worker pool stopped.
Oct 16, 2021 4:02:57 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-16_05_45_49-17323798727383694038 finished with status CANCELLED.
Load test results for test (ID): 13df6ea1-35df-4668-b89d-87eb2f85798e and timestamp: 2021-10-16T12:45:43.518000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                  11528.11
dataflow_v2_java11_total_bytes_count             2.56458986E10
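The two metrics above allow a rough throughput estimate before the job was cancelled: total bytes processed divided by runtime in seconds (a back-of-the-envelope calculation, not a metric the harness itself reports):

```java
public class ThroughputEstimate {
    // Rough throughput from the reported load-test metrics:
    // total bytes processed divided by runtime in seconds.
    public static double bytesPerSecond(double totalBytes, double runtimeSec) {
        return totalBytes / runtimeSec;
    }

    public static void main(String[] args) {
        double runtimeSec = 11528.11;       // dataflow_v2_java11_runtime_sec
        double totalBytes = 2.56458986E10;  // dataflow_v2_java11_total_bytes_count
        double bps = bytesPerSecond(totalBytes, runtimeSec);
        System.out.printf("~%.2f MB/s%n", bps / 1e6); // roughly 2.22 MB/s
    }
}
```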
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
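The RuntimeException above comes from JobFailure.handleFailure, which treats a terminal state other than DONE as a test failure (here CANCELLED, since the job received a cancel request at 16:00). A simplified sketch of that check using plain strings rather than Beam's PipelineResult.State, so it is an approximation of the real code:

```java
public class TerminalStateCheck {
    // Simplified version of the check behind the exception above: a load
    // test ending in any terminal state other than DONE (e.g. CANCELLED)
    // is surfaced as a RuntimeException so the build fails.
    public static void handleFailure(String terminalState) {
        if (!"DONE".equals(terminalState)) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure("CANCELLED");
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // Invalid job state: CANCELLED.
        }
    }
}
```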

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211016124335
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1139a919877fc8a186c189c0b8d5e7658db9db5dace1af8b78d16323f55be1e6
Deleted: sha256:c6608b6371c4637110e361d876b767919e20107a3d6e982557fca038825b21a0
Deleted: sha256:a5692feace293c810ad0f53d96f307288c301534a4aa63c8aa8c3e14959b9233
Deleted: sha256:13d26b6f316d4a433baa86d59e4a468137c6dabf5fb5788bfebe01382496b190
Deleted: sha256:3c9134dbc80125648493f303549a6f88fa321d85e497fd652861b9a2551318af
Deleted: sha256:5bdb43ffdccd3275cb844aed35217d28fe43b54bda72c14edc31d8a958a1339c
Deleted: sha256:f0137e512f65a6f8b679f5ea8ec98cbc22c94ef0ec57d0b5f2f201279ccedadf
Deleted: sha256:efdad8ce1b34f47f2e737402df6a80b86d0a188ded817b79b30ef2483c026a93
Deleted: sha256:c301f0738a9ac93cca8d154a89b7234f75151600b9f76ca734f1c031e090e34b
Deleted: sha256:6f115d7dcccc1934213dade5be75ec199935f6c01c09ad418bc9d59f8e21dbc5
Deleted: sha256:3393a42e8a1ee7e56d2a5158862df7dcef6e80493b0aa8b48ae72cc7678f3228
Deleted: sha256:f711be6ab86448ce20c8351c0848aab43f265948989121a7369e53f559edc729
Deleted: sha256:58ff9fb58d37de28f1cba77f31b8431e90df3290df2818e2f1af0bb887bdf192
Deleted: sha256:2e00d8c9ca6557b4be89e4e146a2b127eae08fbd1628653b6c8468062f06c001
Deleted: sha256:46804d896e74d4f1e95531875be0142af5291abe916f04064b27dac263839cd8
Deleted: sha256:34b4caed1a3bdf5be866dbabcd9ca98f7b38033f0dd995de6eb3c622e711672b
Deleted: sha256:5908e43a92de689066a54b33fb14e1ad226b96fc554bd164c9a6914c3c619db2
Deleted: sha256:eb932af945883c033fa3fc6b93da740b9b77596e157c423e84a78ba36dbecf62
Deleted: sha256:04ecf2be4164234ff927925174181832e863583b836dbd98647f2958f38f3522
Deleted: sha256:403ab567820fe7129b2d117c8663e5b852e341ff5003ad5d83777384cb9b3abe
Deleted: sha256:ddb1bf0bbbbd3f35f354ff392659720ddfd292f1ac80c03931ef3baa1bcaebea
Deleted: sha256:8f68508c4657b06b94a02853e848fef202bf2b8ba3eefaaeb7fb5a330743675e
Deleted: sha256:2a05e65b8c777c3462ad9574123dc3cd316856740ea67ab92e27d3343bbce2e6
Deleted: sha256:c6021de9fbb391c81380278cf46d13fd09c05247b350549e989deae8b1bad323
Deleted: sha256:2c3032d4a45f19e4e9ab451c4def71962133f19904e230b250acb781fea9366b
Deleted: sha256:36691e24c1412315338563b813baa922710ec6e4d385ffe3290c7ce3ab161032
Deleted: sha256:f84fd20473a63ee5f8b1db36aec0cd0987006d32729bd52fe96cdb39bd821fc7
Deleted: sha256:964ec6d0fd593c23dc6dc382addf4d87f9c770583321e430facdde78a8985b3a
Deleted: sha256:6ab16c16fe924bef92b502027c8594854970a7e7bb7e6bf5454e73504eaf43c6
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211016124335]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1139a919877fc8a186c189c0b8d5e7658db9db5dace1af8b78d16323f55be1e6]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211016124335] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1139a919877fc8a186c189c0b8d5e7658db9db5dace1af8b78d16323f55be1e6])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1139a919877fc8a186c189c0b8d5e7658db9db5dace1af8b78d16323f55be1e6
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1139a919877fc8a186c189c0b8d5e7658db9db5dace1af8b78d16323f55be1e6
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1139a919877fc8a186c189c0b8d5e7658db9db5dace1af8b78d16323f55be1e6].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 38s
101 actionable tasks: 71 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/khlinekl22gz2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #120

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/120/display/redirect?page=changes>

Changes:

[ilya.kozyrev] Add java executor to compile java code in created file system

[zyichi] Use daemon thread to report lull operation and move code to

[noreply] Merge pull request #15722 from [BEAM-13022] [Playground] Run code on the

[Luke Cwik] [BEAM-12640] Skip checkerframework on generated code/test libraries for

[Luke Cwik] [BEAM-13015] Add tiny bundle processing benchmark for Java SDK harness.

[noreply] [BEAM-12922] Test MapState putIfAbsent with no read (#15704)

[Robert Bradshaw] [BEAM-13053] Avoid runner v2 when streaming engine explicitly disabled.

[noreply] [Go SDK] Small typo fixes. (#15725)

[noreply] [BEAM-13038] Migrate to use a context object within

[noreply] [BEAM-12135] Avoid windowing overhead in Spark global GBK. (#15637)

[noreply] [BEAM-12907] Run DataFrame API tests with multiple pandas versions

[vachan] Remove unnecessary ERROR logging.


------------------------------------------
[...truncated 49.96 KB...]
9f9f651e9303: Waiting
36e0782f1159: Waiting
0b3c02b5d746: Waiting
211736d7fc9d: Waiting
62a747bf1719: Waiting
627ba6eec1f0: Pushed
72c54bb5cb63: Pushed
46d865f218ac: Pushed
95547c4bcd85: Pushed
7f18771badf7: Pushed
67b322ae068a: Pushed
ac04a64b7850: Pushed
e36537bc2320: Pushed
211736d7fc9d: Layer already exists
1fbcbef2a719: Pushed
62a5b8741e83: Layer already exists
fc415c7af1e8: Pushed
36e0782f1159: Layer already exists
0b3c02b5d746: Layer already exists
ba6e5ff31f23: Layer already exists
62a747bf1719: Layer already exists
9f9f651e9303: Layer already exists
db7df9806082: Pushed
65cd9fd5b8f1: Pushed
20211015124339: digest: sha256:37492b5808443f2d5949274093061710a47346c5d4c3f6de22a40f37b8862c95 size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 15, 2021 12:46:01 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 15, 2021 12:46:02 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 15, 2021 12:46:03 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 15, 2021 12:46:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 15, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 15, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 15, 2021 12:46:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 15, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash e8c6a57bd9cef213102fb818e3b4321873673d1af67d4258a2236e133f392a4e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-6Male9nO8hMQL7gY47QyGHNnPRr2fUJYoiNuEz85Kk4.pb
Oct 15, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 15, 2021 12:46:07 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc]
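The line above shows the synthetic source splitting its workload into 20 bundles. A minimal sketch of the general idea behind such a split() (dividing a total into roughly equal bundle sizes; this is illustrative, not SyntheticUnboundedSource's implementation):

```java
public class SplitSketch {
    // Illustrative: divide a total element count into n roughly equal
    // bundles, the way a source's split() typically partitions its work.
    // Earlier bundles absorb the remainder so the sizes sum to the total.
    public static long[] split(long total, int n) {
        long[] sizes = new long[n];
        for (int i = 0; i < n; i++) {
            sizes[i] = total / n + (i < total % n ? 1 : 0);
        }
        return sizes;
    }

    public static void main(String[] args) {
        long[] sizes = split(100, 20);
        long sum = 0;
        for (long s : sizes) sum += s;
        System.out.println(sum + " elements across " + sizes.length + " bundles");
    }
}
```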
Oct 15, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 15, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 15, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 15, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 15, 2021 12:46:07 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e]
Oct 15, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 15, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 15, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 15, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 15, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 15, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 15, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 15, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 15, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 15, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 15, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 15, 2021 12:46:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 15, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-15_05_46_07-5099732834723193030?project=apache-beam-testing
Oct 15, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-15_05_46_07-5099732834723193030
Oct 15, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-15_05_46_07-5099732834723193030
Oct 15, 2021 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-15T12:46:14.163Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-gywb. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:16.904Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:17.643Z: Expanding SplittableParDo operations into optimizable parts.
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:17.677Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:17.743Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:17.810Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:17.840Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:17.889Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:17.989Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.018Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.039Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.065Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.087Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.115Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.137Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.160Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.195Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.222Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.248Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.275Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.313Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.348Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.369Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.398Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.415Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.438Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.468Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.501Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.528Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.559Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 15, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.585Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 15, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:18.901Z: Starting 5 workers in us-central1-a...
Oct 15, 2021 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:46:26.979Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 15, 2021 12:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:47:05.880Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 15, 2021 12:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:48:07.024Z: Workers have started successfully.
Oct 15, 2021 12:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T12:48:07.051Z: Workers have started successfully.
Oct 15, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T16:00:33.261Z: Cancel request is committed for workflow job: 2021-10-15_05_46_07-5099732834723193030.
Oct 15, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T16:00:33.299Z: Cleaning up.
Oct 15, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T16:00:33.392Z: Stopping worker pool...
Oct 15, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T16:00:33.465Z: Stopping worker pool...
Oct 15, 2021 4:02:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T16:02:52.630Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 15, 2021 4:02:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-15T16:02:52.689Z: Worker pool stopped.
Oct 15, 2021 4:02:59 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-15_05_46_07-5099732834723193030 finished with status CANCELLED.
Load test results for test (ID): 6596d9b1-0af1-4cd3-b7c0-db94f0667212 and timestamp: 2021-10-15T12:46:02.621000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11526.378
dataflow_v2_java11_total_bytes_count             3.17600769E10
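For context, the two metrics above imply an end-to-end throughput of roughly 2.8 MB/s over a ~3.2 hour run. A back-of-the-envelope sketch of that arithmetic (both input values are copied verbatim from this log; the class name is illustrative):

```java
// Back-of-the-envelope throughput from the metrics reported above.
// Both values are copied verbatim from this load-test run.
public class ThroughputSketch {
    public static void main(String[] args) {
        double runtimeSec = 11526.378;      // dataflow_v2_java11_runtime_sec
        double totalBytes = 3.17600769E10;  // dataflow_v2_java11_total_bytes_count
        double bytesPerSec = totalBytes / runtimeSec;
        System.out.printf("runtime: %.1f h, throughput: %.2f MB/s%n",
                runtimeSec / 3600.0, bytesPerSec / 1e6);
    }
}
```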
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
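The RuntimeException above is the load-test harness mapping the terminal job state CANCELLED to a build failure. A minimal, hypothetical sketch of that decision; the names here (`JobStateSketch`, `isFailure`, `cancelledOk`) are illustrative and not the actual `org.apache.beam.sdk.loadtests.JobFailure` API:

```java
// Hypothetical sketch: map a terminal Dataflow job state to pass/fail.
// Mirrors the behavior seen in this log (CANCELLED fails the run),
// but names are illustrative, not the actual Beam load-test API.
public class JobStateSketch {
    enum State { DONE, FAILED, CANCELLED, UPDATED, UNKNOWN }

    // Returns true when the terminal state should fail the build.
    // cancelledOk would let a caller accept CANCELLED for jobs that are
    // deliberately cancelled after a fixed run time.
    static boolean isFailure(State terminalState, boolean cancelledOk) {
        switch (terminalState) {
            case DONE:
                return false;
            case CANCELLED:
                return !cancelledOk;
            default:
                return true; // FAILED, UPDATED, UNKNOWN all fail the run
        }
    }

    public static void main(String[] args) {
        State state = State.CANCELLED;
        // Mirrors the log line: "Invalid job state: CANCELLED."
        System.out.println("Invalid job state: " + state + ". -> failure="
                + isFailure(state, false));
    }
}
```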

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211015124339
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:37492b5808443f2d5949274093061710a47346c5d4c3f6de22a40f37b8862c95
Deleted: sha256:b0c907852126d6cb7a13515ede17d8d4d200012b8683a1647db3cc0b08bb9ae0
Deleted: sha256:27c5f14935511d28286684c8599aeebf7a6a453e7b52b43d32f38f7aa1d5ad1f
Deleted: sha256:176290c3fe5259eba93d5debe5dff5a464e35a78b2a0fa4c0ff6ddfb0edf709b
Deleted: sha256:0f0f1c4e7e253e139b8a2a47f13fb876660de84016503577d7cacac021f192bb
Deleted: sha256:3ee6217474a1e24a3edb08089bfa4bdb19b2603dd7808671a3da3a6749d1be1b
Deleted: sha256:5d12a97bd9a097c769513067f40c0a9bd3f4a2f09709645d30e1d0757e90dc18
Deleted: sha256:448f9ff4615e854aed0135bec8512f1c639bfbd8a439bd6aebac2297f4f3a913
Deleted: sha256:087553de6f51ae15ef3c64ce8a226e16857f54045fc95b754c9f4ce444c746e7
Deleted: sha256:00ace44f9bc60221ab49cc4b2101f25d06657b9dc9b6c54ba6489b278a36ae2e
Deleted: sha256:0693fbf5155d16f83c68601c0eec72707f4dd62c6a5c95dc695c3732fe58ddfe
Deleted: sha256:d3fb5c1969a27379ad47c7ba1dd038e5cd99aaff4a73a4e7ce011f5654102ce6
Deleted: sha256:90ee2b715475d8d31392e671085f13e7c36ff4c7538a0e9651768e5c32611ffb
Deleted: sha256:03ee277a07cd025a21936a629389a83a8ff2ff68e5dcfd531673961d4b23e96c
Deleted: sha256:a83ca80aff119c02d1de27d14bc220a739f0f0fa19aa6f25b468e5d847a56c86
Deleted: sha256:d5c6e403906f542d4b7ad462d4cccd63917e5612223f57d2276aecd002894bd3
Deleted: sha256:3f19505a498a646b7d1d15e58984c1e7c0b3b6fde52083e303bdc701d754dab9
Deleted: sha256:aeeed2b4a1581948ac94bead7a2dc31129c1cdf5df4cb142a98e0256db3023eb
Deleted: sha256:89c77848d2b63b6111f5eff2cf81c79b136a851327484de38a704c5bf56376c4
Deleted: sha256:4e14f9fd4348fb8ebeb2dcd17ea6e2a3b18996717a0246a4acc0197d994fc908
Deleted: sha256:fe4650ab5ddacfc3f154e00efab3eecfa700adaf41f7c578e6c942a1ca411a99
Deleted: sha256:e65af55e3b3ee008f31b3bedcce5fdf077a8ac904ae5072ba28c93b7a079d4cb
Deleted: sha256:03c0f63d9d98017f31ced6b135d7f8f6bd995f1b262d81474692d716d5e6862c
Deleted: sha256:642977761a622ce0819e1ad2eba7438a70f165cbf6dbb68d06dbdd0ec6e9b1fa
Deleted: sha256:1099a3ca149d5cf61a79d6fd5fb97a6ebd55af680a9462ff70277a067f93a47c
Deleted: sha256:0ed7259d6bb12aa03171b77f0abb5babb7689b81bd68ff2bef28a07e8b0fa52b
Deleted: sha256:670aa8b914db5b5779af2e27d3b2b767d13e71f5e94d6ce0f83b75f8cbd325a1
Deleted: sha256:bc06fa176310809f50267375306769dc6e06103a412a904c0162d801f6442c84
Deleted: sha256:88d4212a8fef6fd7d7cc9b6b044e71085d3041f49756a79ca576ccc2e65bc088
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211015124339]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:37492b5808443f2d5949274093061710a47346c5d4c3f6de22a40f37b8862c95]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211015124339] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:37492b5808443f2d5949274093061710a47346c5d4c3f6de22a40f37b8862c95])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:37492b5808443f2d5949274093061710a47346c5d4c3f6de22a40f37b8862c95
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:37492b5808443f2d5949274093061710a47346c5d4c3f6de22a40f37b8862c95
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:37492b5808443f2d5949274093061710a47346c5d4c3f6de22a40f37b8862c95].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 40s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ss7fflnnu6e7g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #119

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/119/display/redirect?page=changes>

Changes:

[Udi Meiri] Release guide: minor updates

[dmitrii_kuzin] [BEAM-12730] Custom delimiter always bytes

[noreply] Minor: Add base_func to Series.aggregate expression names (#15706)

[noreply] [BEAM-13013] Extend cross language expansion. (#15698)

[noreply] Minor: Clarify confusing boolean logic for selecting PerWindowInvoker

[noreply] [BEAM-4424] Execute hooks in enabling order (#15713)

[noreply] [BEAM-11097] Update cache with metrics counting, windowing (#15717)

[kawaigin] Prepare release of apache-beam-jupyterlab-sidepanel v2.0.0

[noreply] [BEAM-12644] Javascript files moved to assets (#15653)


------------------------------------------
[...truncated 49.23 KB...]
fc92b19a474c: Preparing
a4201d4818e2: Preparing
37328c6b84db: Preparing
305bb936906e: Preparing
c1b02b399900: Preparing
fa030dac2f06: Preparing
f5ac1c8f92d2: Preparing
211736d7fc9d: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
62a747bf1719: Preparing
a4201d4818e2: Waiting
305bb936906e: Waiting
211736d7fc9d: Waiting
c1b02b399900: Waiting
37328c6b84db: Waiting
62a5b8741e83: Waiting
fa030dac2f06: Waiting
f5ac1c8f92d2: Waiting
9f9f651e9303: Waiting
0b3c02b5d746: Waiting
ba6e5ff31f23: Waiting
62a747bf1719: Waiting
36e0782f1159: Waiting
b3998852e35b: Pushed
bfcfb0b3af60: Pushed
19115141af1a: Pushed
fc92b19a474c: Pushed
1a70880f1ea8: Pushed
2c38276686a8: Pushed
37328c6b84db: Pushed
305bb936906e: Pushed
211736d7fc9d: Layer already exists
fa030dac2f06: Pushed
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
f5ac1c8f92d2: Pushed
a4201d4818e2: Pushed
ba6e5ff31f23: Layer already exists
0b3c02b5d746: Layer already exists
9f9f651e9303: Layer already exists
62a747bf1719: Layer already exists
c1b02b399900: Pushed
20211014124331: digest: sha256:0010396bb221b6eb520a8658a9ed24f2b07aa8fb9e7f2087001876e1ea010d23 size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 14, 2021 12:45:16 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 14, 2021 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 14, 2021 12:45:17 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 14, 2021 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 14, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 14, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 14, 2021 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 14, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 9950a4903cb02103c09974ea3d237e9d1a1625294d1944271b38f71744b1fc5d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-mVCkkDywIQPAmXTqPSN-nRoWJSlNGUQnGzj3F0Sx_F0.pb
Oct 14, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 14, 2021 12:45:22 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc]
Oct 14, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 14, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 14, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 14, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 14, 2021 12:45:22 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e]
Oct 14, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 14, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 14, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 14, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 14, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 14, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 14, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 14, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 14, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 14, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 14, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 14, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 14, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-14_05_45_22-2948433723216611995?project=apache-beam-testing
Oct 14, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-14_05_45_22-2948433723216611995
Oct 14, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-14_05_45_22-2948433723216611995
Oct 14, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-14T12:45:28.808Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-xsnp. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 14, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:44.096Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 14, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:44.767Z: Expanding SplittableParDo operations into optimizable parts.
Oct 14, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:44.790Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 14, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:44.848Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 14, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:44.918Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 14, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:44.947Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 14, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.012Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 14, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.101Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 14, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.135Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 14, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.171Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 14, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.197Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 14, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.230Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 14, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.255Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 14, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.285Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 14, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.310Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 14, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.334Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 14, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.375Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 14, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.408Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 14, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.444Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 14, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.472Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 14, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.493Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 14, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.518Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 14, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.563Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 14, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.595Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 14, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.629Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 14, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.664Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 14, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.698Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 14, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.730Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 14, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.775Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 14, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:45.798Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 14, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:46.179Z: Starting 5 workers in us-central1-a...
Oct 14, 2021 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:45:59.019Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 14, 2021 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:46:33.082Z: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 14, 2021 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:46:33.109Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
Oct 14, 2021 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:47:03.810Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 14, 2021 12:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:47:31.701Z: Workers have started successfully.
Oct 14, 2021 12:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T12:47:31.737Z: Workers have started successfully.
Oct 14, 2021 1:41:30 PM org.apache.beam.sdk.metrics.MetricsEnvironment getCurrentContainer
WARNING: Reporting metrics are not supported in the current execution environment.
Oct 14, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T16:00:26.932Z: Cancel request is committed for workflow job: 2021-10-14_05_45_22-2948433723216611995.
Oct 14, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T16:00:26.980Z: Cleaning up.
Oct 14, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T16:00:27.063Z: Stopping worker pool...
Oct 14, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T16:00:27.109Z: Stopping worker pool...
Oct 14, 2021 4:02:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T16:02:44.617Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 14, 2021 4:02:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-14T16:02:44.694Z: Worker pool stopped.
Oct 14, 2021 4:02:51 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-14_05_45_22-2948433723216611995 finished with status CANCELLED.
Load test results for test (ID): 9c9961b7-6df9-4864-bdf7-46431390b3bb and timestamp: 2021-10-14T12:45:17.325000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11493.503
dataflow_v2_java11_total_bytes_count              3.0893436E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

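The "Invalid job state: CANCELLED" failure above is raised by the load test's terminal-state check (JobFailure.handleFailure in the stack trace) after the harness cancelled the streaming job at 16:00 UTC ("Cancel request is committed"). A minimal self-contained sketch of that pattern, using illustrative names rather than the actual Beam classes:

```java
// Hypothetical sketch of a terminal-state check that produces a failure
// like "Invalid job state: CANCELLED." -- names are illustrative, not
// the real org.apache.beam.sdk.loadtests classes.
public class JobFailureSketch {
    enum State { DONE, FAILED, CANCELLED }

    // Throws when the job did not reach the expected DONE state,
    // which turns a cancelled run into a non-zero process exit.
    static void handleFailure(State terminalState) {
        if (terminalState != State.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

Under this reading, the Gradle task failure below is the expected outcome of the job being cancelled rather than finishing in a DONE state.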
> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211014124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0010396bb221b6eb520a8658a9ed24f2b07aa8fb9e7f2087001876e1ea010d23
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211014124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0010396bb221b6eb520a8658a9ed24f2b07aa8fb9e7f2087001876e1ea010d23]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211014124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0010396bb221b6eb520a8658a9ed24f2b07aa8fb9e7f2087001876e1ea010d23])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0010396bb221b6eb520a8658a9ed24f2b07aa8fb9e7f2087001876e1ea010d23
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0010396bb221b6eb520a8658a9ed24f2b07aa8fb9e7f2087001876e1ea010d23
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0010396bb221b6eb520a8658a9ed24f2b07aa8fb9e7f2087001876e1ea010d23].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 37s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/32peok6pgqug4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #118

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/118/display/redirect?page=changes>

Changes:

[cgray] [BEAM-7169] Fix singleton assertion in PAssert javadoc.

[noreply] Revert "[BEAM-10913] - Forcing update of YAML file by running kubectl

[heejong] Update Dataflow Python container tag

[noreply] Merge pull request #15667 from [BEAM-12730] Add custom delimiters to

[noreply] Merge pull request #15700 from [BEAM-12987] [Playground] Add page UI

[kawaigin] [BEAM-12997] Upgraded labextension to V3

[noreply] [BEAM-12513] Add JIRAs for missing Go SDK features (#15658)


------------------------------------------
[...truncated 49.00 KB...]
48e502b38c40: Preparing
9fe49eb8f698: Preparing
21bba6fc7db0: Preparing
04e5cba339ad: Preparing
8bfabf5d869b: Preparing
020514861326: Preparing
365d9dfe3001: Preparing
2329b590aa4e: Preparing
527eb80015ed: Preparing
f505a9e8cbc6: Preparing
f04d52741e62: Preparing
341f0e6e941d: Preparing
211736d7fc9d: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
62a747bf1719: Preparing
211736d7fc9d: Waiting
62a5b8741e83: Waiting
365d9dfe3001: Waiting
36e0782f1159: Waiting
9f9f651e9303: Waiting
ba6e5ff31f23: Waiting
0b3c02b5d746: Waiting
2329b590aa4e: Waiting
62a747bf1719: Waiting
527eb80015ed: Waiting
020514861326: Waiting
f505a9e8cbc6: Waiting
341f0e6e941d: Waiting
f04d52741e62: Waiting
21bba6fc7db0: Pushed
8bfabf5d869b: Pushed
9fe49eb8f698: Pushed
48e502b38c40: Pushed
020514861326: Pushed
04e5cba339ad: Pushed
2329b590aa4e: Pushed
527eb80015ed: Pushed
211736d7fc9d: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
365d9dfe3001: Pushed
9f9f651e9303: Layer already exists
ba6e5ff31f23: Layer already exists
341f0e6e941d: Pushed
0b3c02b5d746: Layer already exists
62a747bf1719: Layer already exists
f04d52741e62: Pushed
f505a9e8cbc6: Pushed
20211013124333: digest: sha256:40f67b11010e01baf38404e6a41e5f9346ec0a2575d61edf1310938d4ed891ad size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 13, 2021 12:46:26 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 13, 2021 12:46:26 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 13, 2021 12:46:27 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 13, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 13, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 13, 2021 12:46:31 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 13, 2021 12:46:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 13, 2021 12:46:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 8ecf4624414404a248a31cca3f8be5d470069840d823c09bb168c1eca9625d45> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-js9GJEFEBKJIoxzKP4vl1HAGmEDYI8CbsWjB7KliXUU.pb
Oct 13, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 13, 2021 12:46:33 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@aaa0f76, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@340a8894]
Oct 13, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 13, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 13, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 13, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 13, 2021 12:46:33 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34b9eb03, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43fda8d9]
Oct 13, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 13, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 13, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 13, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 13, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 13, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 13, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 13, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 13, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 13, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 13, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 13, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 13, 2021 12:46:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-13_05_46_33-9020637892299962970?project=apache-beam-testing
Oct 13, 2021 12:46:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-13_05_46_33-9020637892299962970
Oct 13, 2021 12:46:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-13_05_46_33-9020637892299962970
Oct 13, 2021 12:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-13T12:46:49.979Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-yw0n. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 13, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:53.964Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 13, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:54.661Z: Expanding SplittableParDo operations into optimizable parts.
Oct 13, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:54.695Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 13, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:54.753Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 13, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:54.810Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 13, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:54.840Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 13, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:54.902Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 13, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:54.992Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 13, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.030Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 13, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.063Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 13, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.095Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 13, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.116Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 13, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.150Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 13, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.182Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 13, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.216Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 13, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.239Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 13, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.268Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 13, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.302Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 13, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.348Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 13, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.382Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 13, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.414Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 13, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.444Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 13, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.477Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 13, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.510Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 13, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.533Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 13, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.567Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 13, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.600Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 13, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.624Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 13, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.658Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 13, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:55.691Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 13, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:46:56.061Z: Starting 5 workers in us-central1-a...
Oct 13, 2021 12:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:47:23.543Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 13, 2021 12:47:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:47:37.187Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 13, 2021 12:48:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:48:24.993Z: Workers have started successfully.
Oct 13, 2021 12:48:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T12:48:25.029Z: Workers have started successfully.
Oct 13, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T16:00:38.767Z: Cancel request is committed for workflow job: 2021-10-13_05_46_33-9020637892299962970.
Oct 13, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T16:00:38.895Z: Cleaning up.
Oct 13, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T16:00:38.992Z: Stopping worker pool...
Oct 13, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T16:00:39.058Z: Stopping worker pool...
Oct 13, 2021 4:03:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T16:03:03.417Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 13, 2021 4:03:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-13T16:03:03.462Z: Worker pool stopped.
Oct 13, 2021 4:03:11 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-13_05_46_33-9020637892299962970 finished with status CANCELLED.
Load test results for test (ID): 31b7261a-0453-43da-8705-e569b2005f40 and timestamp: 2021-10-13T12:46:27.241000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11449.456
dataflow_v2_java11_total_bytes_count             3.20521575E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211013124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:40f67b11010e01baf38404e6a41e5f9346ec0a2575d61edf1310938d4ed891ad
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211013124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:40f67b11010e01baf38404e6a41e5f9346ec0a2575d61edf1310938d4ed891ad]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211013124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:40f67b11010e01baf38404e6a41e5f9346ec0a2575d61edf1310938d4ed891ad])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:40f67b11010e01baf38404e6a41e5f9346ec0a2575d61edf1310938d4ed891ad
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:40f67b11010e01baf38404e6a41e5f9346ec0a2575d61edf1310938d4ed891ad
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:40f67b11010e01baf38404e6a41e5f9346ec0a2575d61edf1310938d4ed891ad].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 56s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/kaj6iehb6j4xq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #117

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/117/display/redirect?page=changes>

Changes:

[rogan.o.morrow] [BEAM-12875] Register file systems in SparkExecutableStageFunction

[rogelio.hernandez] [BEAM-12371][BEAM-12373] Fixed page navigation

[Kyle Weaver] [BEAM-12694] Include datetime in dicom test dataset name.

[noreply] [BEAM-12960] Add extra context for structural DoFn ProcessElement error

[noreply] Release 2.33 documentation update (#15543)

[noreply] Merge pull request #15441 from [BEAM-8823] Make FnApiRunner work by


------------------------------------------
[...truncated 49.76 KB...]
d35dc7f4c79e: Waiting
d30f6a731c80: Waiting
3054497613e6: Waiting
1056964ed652: Waiting
c52aab93b4d8: Waiting
9e2d103a3ffa: Waiting
6aac433ceedb: Pushed
49b480f412bb: Pushed
0ddd9ba0266b: Pushed
455bc4cc0c24: Pushed
de3891f9f853: Pushed
edda9796484a: Pushed
9e2d103a3ffa: Pushed
1056964ed652: Pushed
874ad65f91ea: Layer already exists
0fc2498b65e5: Layer already exists
d08e6b97bf21: Layer already exists
3054497613e6: Layer already exists
d35dc7f4c79e: Layer already exists
dabfe5b2ea81: Layer already exists
5e6a409f30b6: Layer already exists
a98514a91652: Pushed
c52aab93b4d8: Pushed
d30f6a731c80: Pushed
86cde4e619ba: Pushed
20211012124337: digest: sha256:bffcaa579e742b49fc16a642742154b49af8008e7571cc861d3a56f171aaaee0 size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 12, 2021 12:45:48 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 12, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 12, 2021 12:45:49 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 12, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 12, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 12, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 12, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 12, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 3c766c7843f2aa70123fd97f34aa48ff61eceb34ca46ec3d5bf4eb1392fe52df> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-PHZseEPyqnASP9l_NKpI_2Hs6zTKRuw9W_TrE5L-Ut8.pb
Oct 12, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 12, 2021 12:45:53 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@aaa0f76, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@340a8894]
Oct 12, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 12, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 12, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 12, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 12, 2021 12:45:53 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34b9eb03, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43fda8d9]
Oct 12, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 12, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 12, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 12, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 12, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 12, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 12, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 12, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 12, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 12, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 12, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 12, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 12, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-12_05_45_54-13152940544920821177?project=apache-beam-testing
Oct 12, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-12_05_45_54-13152940544920821177
Oct 12, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-12_05_45_54-13152940544920821177
Oct 12, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-12T12:45:59.780Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-5iov. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 12, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:03.238Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 12, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:03.949Z: Expanding SplittableParDo operations into optimizable parts.
Oct 12, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:03.986Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 12, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.080Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 12, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.145Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 12, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.167Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 12, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.222Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 12, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.325Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 12, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.359Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 12, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.383Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 12, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.412Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 12, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.437Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 12, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.474Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 12, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.497Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 12, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.534Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 12, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.572Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 12, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.607Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 12, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.633Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 12, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.675Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 12, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.707Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 12, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.732Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 12, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.754Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 12, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.778Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 12, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.805Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 12, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.833Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 12, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.867Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 12, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.890Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 12, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.924Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 12, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.951Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 12, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:04.973Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 12, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:05.343Z: Starting 5 workers in us-central1-a...
Oct 12, 2021 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:33.042Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 12, 2021 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:42.512Z: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 12, 2021 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:42.547Z: Resized worker pool to 3, though goal was 5. This could be a quota issue.
Oct 12, 2021 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:46:52.851Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 12, 2021 12:47:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:47:46.501Z: Workers have started successfully.
Oct 12, 2021 12:47:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T12:47:46.530Z: Workers have started successfully.
Oct 12, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T16:00:26.504Z: Cancel request is committed for workflow job: 2021-10-12_05_45_54-13152940544920821177.
Oct 12, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T16:00:26.571Z: Cleaning up.
Oct 12, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T16:00:26.645Z: Stopping worker pool...
Oct 12, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T16:00:26.701Z: Stopping worker pool...
Oct 12, 2021 4:02:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T16:02:46.645Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 12, 2021 4:02:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-12T16:02:46.685Z: Worker pool stopped.
Oct 12, 2021 4:02:52 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-12_05_45_54-13152940544920821177 finished with status CANCELLED.
Load test results for test (ID): 0c98951e-4da4-46c4-b560-3e6e568486ff and timestamp: 2021-10-12T12:45:48.850000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11533.222
dataflow_v2_java11_total_bytes_count             2.52451079E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211012124337
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bffcaa579e742b49fc16a642742154b49af8008e7571cc861d3a56f171aaaee0
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211012124337]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bffcaa579e742b49fc16a642742154b49af8008e7571cc861d3a56f171aaaee0]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211012124337] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bffcaa579e742b49fc16a642742154b49af8008e7571cc861d3a56f171aaaee0])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bffcaa579e742b49fc16a642742154b49af8008e7571cc861d3a56f171aaaee0
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bffcaa579e742b49fc16a642742154b49af8008e7571cc861d3a56f171aaaee0
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:bffcaa579e742b49fc16a642742154b49af8008e7571cc861d3a56f171aaaee0].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45752f197dc2b623f8d2bfd3a4768e03f4f32a7cedfc9820a4a184bbf86542a7
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45752f197dc2b623f8d2bfd3a4768e03f4f32a7cedfc9820a4a184bbf86542a7
ERROR: (gcloud.container.images.delete) Not found: response: {'status': '404', 'content-length': '168', 'x-xss-protection': '0', 'transfer-encoding': 'chunked', 'server': 'Docker Registry', '-content-encoding': 'gzip', 'docker-distribution-api-version': 'registry/2.0', 'cache-control': 'private', 'date': 'Tue, 12 Oct 2021 16:02:56 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
Failed to compute blob liveness for manifest: 'sha256:45752f197dc2b623f8d2bfd3a4768e03f4f32a7cedfc9820a4a184bbf86542a7': None

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 282

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 36s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/lwhfh4bgae6cy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #116

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/116/display/redirect>

Changes:


------------------------------------------
[...truncated 49.34 KB...]
f9b9b01b9d02: Preparing
3190cf4007eb: Preparing
c405272d6d12: Preparing
874ad65f91ea: Preparing
0fc2498b65e5: Preparing
d08e6b97bf21: Preparing
3054497613e6: Preparing
d35dc7f4c79e: Preparing
dabfe5b2ea81: Preparing
5e6a409f30b6: Preparing
c405272d6d12: Waiting
3054497613e6: Waiting
d35dc7f4c79e: Waiting
dabfe5b2ea81: Waiting
a1b58d8c8a0e: Waiting
dceee12a4edb: Waiting
f9b9b01b9d02: Waiting
7a742a74631e: Waiting
874ad65f91ea: Waiting
5e6a409f30b6: Waiting
a3014c377ac0: Waiting
0fc2498b65e5: Waiting
3190cf4007eb: Waiting
cebda604bcad: Pushed
001242b60b94: Pushed
4ccfeddcc41f: Pushed
a3014c377ac0: Pushed
b4346c49963e: Pushed
a1b58d8c8a0e: Pushed
ad430934355c: Pushed
7a742a74631e: Pushed
874ad65f91ea: Layer already exists
0fc2498b65e5: Layer already exists
d08e6b97bf21: Layer already exists
3190cf4007eb: Pushed
c405272d6d12: Pushed
3054497613e6: Layer already exists
d35dc7f4c79e: Layer already exists
dceee12a4edb: Pushed
5e6a409f30b6: Layer already exists
dabfe5b2ea81: Layer already exists
f9b9b01b9d02: Pushed
20211011124333: digest: sha256:58ec7115f0d0dc5d6c434e51effde90b875c3e160e65f28b2a217b9a4dbf7eb7 size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 11, 2021 12:45:35 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 11, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 11, 2021 12:45:36 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 11, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 11, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 11, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 11, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 11, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash d964e1819c1ba92903d867b2e840ada497dac5244767451a89033a8f391fc3ec> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-2WThgZwbqSkD2Gey6ECtpJfaxSRHZ0UaiQM6jzkfw-w.pb
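The staged file name above encodes the upload's SHA-256 digest: the hex hash is URL-safe base64 encoded with the padding stripped. A minimal Python sketch reproducing that naming scheme (`staged_name` is a hypothetical helper for illustration, not a Beam API):

```python
import base64

def staged_name(prefix: str, sha256_hex: str, ext: str) -> str:
    # URL-safe base64 of the raw digest bytes, '=' padding removed,
    # matching the pipeline-....pb name in the log line above.
    digest = bytes.fromhex(sha256_hex)
    token = base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")
    return f"{prefix}-{token}{ext}"

h = "d964e1819c1ba92903d867b2e840ada497dac5244767451a89033a8f391fc3ec"
print(staged_name("pipeline", h, ".pb"))
# pipeline-2WThgZwbqSkD2Gey6ECtpJfaxSRHZ0UaiQM6jzkfw-w.pb
```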
Oct 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 11, 2021 12:45:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc]
Oct 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 11, 2021 12:45:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e]
Oct 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 11, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 11, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-11_05_45_41-7093437645724640072?project=apache-beam-testing
Oct 11, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-11_05_45_41-7093437645724640072
Oct 11, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-11_05_45_41-7093437645724640072
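The runner prints the cancel command above for manual use. If you wanted to invoke it programmatically, a sketch assembling the same argv list (for, say, subprocess.run) could look like this; `cancel_cmd` is a hypothetical helper, and the flag order simply mirrors the log line:

```python
def cancel_cmd(project: str, region: str, job_id: str) -> list[str]:
    # Mirrors the gcloud invocation printed by DataflowRunner above.
    return [
        "gcloud", "dataflow", "jobs",
        f"--project={project}", "cancel",
        f"--region={region}", job_id,
    ]

print(" ".join(cancel_cmd("apache-beam-testing", "us-central1",
                          "2021-10-11_05_45_41-7093437645724640072")))
# gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-11_05_45_41-7093437645724640072
```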
Oct 11, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-11T12:45:47.694Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-edm4. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:51.856Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:52.550Z: Expanding SplittableParDo operations into optimizable parts.
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:52.597Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:52.667Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:52.737Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:52.757Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:52.814Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:52.893Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:52.928Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:52.960Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:52.982Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.006Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.038Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.061Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.094Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.127Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.160Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.195Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.228Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.259Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.297Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.324Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.344Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.369Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.402Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.435Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.458Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.490Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 11, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.523Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 11, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.549Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 11, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:45:53.891Z: Starting 5 workers in us-central1-a...
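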
Oct 11, 2021 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:46:11.481Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 11, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:46:24.584Z: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 11, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:46:24.620Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
Oct 11, 2021 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:46:34.857Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 11, 2021 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:47:27.903Z: Workers have started successfully.
Oct 11, 2021 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T12:47:27.933Z: Workers have started successfully.
Oct 11, 2021 3:00:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-11T15:00:56.753Z: Staged package kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar' is inaccessible.
Oct 11, 2021 3:00:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-11T15:00:56.918Z: Staged package lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar' is inaccessible.
Oct 11, 2021 3:00:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-11T15:00:57.764Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 11, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T16:00:30.452Z: Cancel request is committed for workflow job: 2021-10-11_05_45_41-7093437645724640072.
Oct 11, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T16:00:30.492Z: Cleaning up.
Oct 11, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T16:00:30.584Z: Stopping worker pool...
Oct 11, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T16:00:30.647Z: Stopping worker pool...
Oct 11, 2021 4:02:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T16:02:52.031Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 11, 2021 4:02:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-11T16:02:52.060Z: Worker pool stopped.
Oct 11, 2021 4:02:59 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-11_05_45_41-7093437645724640072 finished with status CANCELLED.
Load test results for test (ID): 19df744a-310b-41da-94ae-ba4b3de0b85a and timestamp: 2021-10-11T12:45:36.598000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11534.295
dataflow_v2_java11_total_bytes_count             2.80176771E10
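The two metrics above allow a quick sanity check of the run's average throughput. This is a back-of-the-envelope calculation on the logged values, not a number the load test itself reports:

```python
# Values copied from the load test result lines above.
runtime_sec = 11534.295          # dataflow_v2_java11_runtime_sec (~3h12m)
total_bytes = 2.80176771e10      # dataflow_v2_java11_total_bytes_count (~28 GB)

# Average bytes processed per second over the whole run.
throughput = total_bytes / runtime_sec
print(f"{throughput / 1e6:.2f} MB/s")
```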
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211011124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:58ec7115f0d0dc5d6c434e51effde90b875c3e160e65f28b2a217b9a4dbf7eb7
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211011124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:58ec7115f0d0dc5d6c434e51effde90b875c3e160e65f28b2a217b9a4dbf7eb7]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211011124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:58ec7115f0d0dc5d6c434e51effde90b875c3e160e65f28b2a217b9a4dbf7eb7])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:58ec7115f0d0dc5d6c434e51effde90b875c3e160e65f28b2a217b9a4dbf7eb7
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:58ec7115f0d0dc5d6c434e51effde90b875c3e160e65f28b2a217b9a4dbf7eb7
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:58ec7115f0d0dc5d6c434e51effde90b875c3e160e65f28b2a217b9a4dbf7eb7].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 43s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/nygtfqxxlkqco

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #115

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/115/display/redirect>

Changes:


------------------------------------------
[...truncated 49.05 KB...]
fdccd057115e: Preparing
a1dfa02f6031: Preparing
26c28ed988c8: Preparing
c702fe3df9da: Preparing
c44e21fb849a: Preparing
592385105b40: Preparing
155ac4444f11: Preparing
da53b36b3009: Preparing
6f14d5c08ef2: Preparing
5d4f43e30d09: Preparing
0efe004fdc66: Preparing
b6fc84e1bca0: Preparing
874ad65f91ea: Preparing
0fc2498b65e5: Preparing
d08e6b97bf21: Preparing
3054497613e6: Preparing
d35dc7f4c79e: Preparing
dabfe5b2ea81: Preparing
5e6a409f30b6: Preparing
b6fc84e1bca0: Waiting
874ad65f91ea: Waiting
0fc2498b65e5: Waiting
592385105b40: Waiting
d08e6b97bf21: Waiting
155ac4444f11: Waiting
3054497613e6: Waiting
da53b36b3009: Waiting
d35dc7f4c79e: Waiting
6f14d5c08ef2: Waiting
dabfe5b2ea81: Waiting
5d4f43e30d09: Waiting
5e6a409f30b6: Waiting
0efe004fdc66: Waiting
c44e21fb849a: Pushed
a1dfa02f6031: Pushed
26c28ed988c8: Pushed
592385105b40: Pushed
fdccd057115e: Pushed
da53b36b3009: Pushed
c702fe3df9da: Pushed
6f14d5c08ef2: Pushed
874ad65f91ea: Layer already exists
0fc2498b65e5: Layer already exists
d08e6b97bf21: Layer already exists
3054497613e6: Layer already exists
d35dc7f4c79e: Layer already exists
dabfe5b2ea81: Layer already exists
5e6a409f30b6: Layer already exists
0efe004fdc66: Pushed
b6fc84e1bca0: Pushed
155ac4444f11: Pushed
5d4f43e30d09: Pushed
20211010124345: digest: sha256:2923c19b9c9fcb013a5ff0e6a011fca8c844b6510756f03a55b5c8979fa8172d size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 10, 2021 12:46:26 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 10, 2021 12:46:26 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 10, 2021 12:46:27 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 10, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 10, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 10, 2021 12:46:31 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 10, 2021 12:46:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 10, 2021 12:46:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 2eb408e091d7f1c72e90669c412d5fc95bc62023f5a989a2a15536a4a8f13ca0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-LrQI4JHX8ccukGacQS1fyVvGICP1qYmioVU2pKjxPKA.pb
Oct 10, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 10, 2021 12:46:33 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc]
Oct 10, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 10, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 10, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 10, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 10, 2021 12:46:33 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e]
Oct 10, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 10, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 10, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 10, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 10, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 10, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 10, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 10, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 10, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 10, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 10, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 10, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 10, 2021 12:46:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-10_05_46_33-8151387498521069836?project=apache-beam-testing
Oct 10, 2021 12:46:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-10_05_46_33-8151387498521069836
Oct 10, 2021 12:46:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-10_05_46_33-8151387498521069836
Oct 10, 2021 12:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-10T12:46:40.829Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-ldct. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 10, 2021 12:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:45.091Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 10, 2021 12:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:45.823Z: Expanding SplittableParDo operations into optimizable parts.
Oct 10, 2021 12:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:45.851Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 10, 2021 12:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:45.920Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 10, 2021 12:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:45.979Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 10, 2021 12:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.014Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.085Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.210Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.242Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.299Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.333Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.377Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.413Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.449Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.481Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.522Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.546Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.575Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.613Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.638Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.679Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.712Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.742Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.779Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.810Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.844Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.881Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.915Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.939Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:46.971Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 10, 2021 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:46:47.330Z: Starting 5 workers in us-central1-a...
Oct 10, 2021 12:47:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:47:20.296Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 10, 2021 12:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:47:32.073Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 10, 2021 12:48:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:48:26.139Z: Workers have started successfully.
Oct 10, 2021 12:48:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T12:48:26.163Z: Workers have started successfully.
Oct 10, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T16:00:27.990Z: Cancel request is committed for workflow job: 2021-10-10_05_46_33-8151387498521069836.
Oct 10, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T16:00:28.078Z: Cleaning up.
Oct 10, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T16:00:28.205Z: Stopping worker pool...
Oct 10, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T16:00:28.275Z: Stopping worker pool...
Oct 10, 2021 4:02:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T16:02:46.730Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 10, 2021 4:02:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-10T16:02:46.767Z: Worker pool stopped.
Oct 10, 2021 4:02:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-10_05_46_33-8151387498521069836 finished with status CANCELLED.
Load test results for test (ID): a8ed30b6-9ca7-4597-90ac-99aeb97b7bd0 and timestamp: 2021-10-10T12:46:27.062000000Z:
Metric:                                  Value:
dataflow_v2_java11_runtime_sec           11479.373
dataflow_v2_java11_total_bytes_count     2.61106055E10
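The two metrics above imply an average throughput for the run. This is a quick derived figure, not part of the load-test harness output; the class name and method are illustrative only:

```java
// Back-of-envelope throughput implied by the reported load-test metrics.
public class Throughput {
    // Convert a byte count over a wall-clock duration to MB/s.
    static double mbPerSec(double totalBytes, double runtimeSec) {
        return totalBytes / runtimeSec / 1e6;
    }

    public static void main(String[] args) {
        double runtimeSec = 11479.373;       // dataflow_v2_java11_runtime_sec
        double totalBytes = 2.61106055E10;   // dataflow_v2_java11_total_bytes_count
        System.out.printf("Average throughput: %.2f MB/s%n",
            mbPerSec(totalBytes, runtimeSec)); // roughly 2.27 MB/s
    }
}
```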
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
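The stack trace shows why a cancelled streaming job surfaces as a build failure: LoadTest.run checks the job's terminal state and JobFailure.handleFailure raises for anything other than a successful finish. A minimal sketch of that pattern, assuming a simplified state check (the real handleFailure in Beam also inspects other failure conditions; the enum and class here are illustrative):

```java
// Terminal states a pipeline job can land in (illustrative subset).
enum JobState { DONE, CANCELLED, FAILED }

public class JobFailureSketch {
    // Throw if the job did not finish successfully, so the Gradle 'run'
    // task exits non-zero and Jenkins marks the build as FAILURE.
    static void handleFailure(JobState terminalState) {
        if (terminalState != JobState.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(JobState.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // Invalid job state: CANCELLED.
        }
    }
}
```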

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211010124345
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2923c19b9c9fcb013a5ff0e6a011fca8c844b6510756f03a55b5c8979fa8172d
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211010124345]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2923c19b9c9fcb013a5ff0e6a011fca8c844b6510756f03a55b5c8979fa8172d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211010124345] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2923c19b9c9fcb013a5ff0e6a011fca8c844b6510756f03a55b5c8979fa8172d])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2923c19b9c9fcb013a5ff0e6a011fca8c844b6510756f03a55b5c8979fa8172d
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2923c19b9c9fcb013a5ff0e6a011fca8c844b6510756f03a55b5c8979fa8172d
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2923c19b9c9fcb013a5ff0e6a011fca8c844b6510756f03a55b5c8979fa8172d].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 31s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/jhy47c5oha5ow

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #114

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/114/display/redirect?page=changes>

Changes:

[noreply] fixing test link

[Brian Hulette] Add DataFrame changes to CHANGES.md

[stranniknm] [BEAM-12964]: add code editor component

[stranniknm] [BEAM-12966]: load initial example on page load

[stranniknm] [BEAM-12964]: fix editor scrolling

[stranniknm] [BEAM-12964]: add dark theme support

[stranniknm] [BEAM-12964]: add licence to example repository file

[stranniknm] [BEAM-12964]: extract playground page providers

[stranniknm] [BEAM-12964]: refactor styles

[noreply] fix linter issues

[Brian Hulette] Don't use run_pytest.sh for pyarrow tests

[Brian Hulette] Fail in run_pytest.sh if -m is specified

[noreply] [BEAM-10913] - Forcing update of YAML file by running kubectl apply

[noreply] [BEAM-10955] Flink Java Runner test flake: Could not find Flink job

[noreply] [BEAM-10114] Bump Pub/Sub Lite version (#15640)


------------------------------------------
[...truncated 49.40 KB...]
dee979226441: Preparing
1916c76e2494: Preparing
efa2acb5fb41: Preparing
2c95d17f6062: Preparing
42561a8a8281: Preparing
d73fc204f59e: Preparing
3f892ce166a8: Preparing
9e2f628bd2c2: Preparing
9b4008169b79: Preparing
4e60e3c97e6a: Preparing
73a376432bd2: Preparing
6008dee27547: Preparing
874ad65f91ea: Preparing
0fc2498b65e5: Preparing
d08e6b97bf21: Preparing
3054497613e6: Preparing
d35dc7f4c79e: Preparing
dabfe5b2ea81: Preparing
5e6a409f30b6: Preparing
3f892ce166a8: Waiting
9b4008169b79: Waiting
d73fc204f59e: Waiting
874ad65f91ea: Waiting
5e6a409f30b6: Waiting
73a376432bd2: Waiting
3054497613e6: Waiting
9e2f628bd2c2: Waiting
d08e6b97bf21: Waiting
6008dee27547: Waiting
4e60e3c97e6a: Waiting
d35dc7f4c79e: Waiting
1916c76e2494: Pushed
42561a8a8281: Pushed
efa2acb5fb41: Pushed
dee979226441: Pushed
d73fc204f59e: Pushed
2c95d17f6062: Pushed
9e2f628bd2c2: Pushed
9b4008169b79: Pushed
874ad65f91ea: Layer already exists
0fc2498b65e5: Layer already exists
d08e6b97bf21: Layer already exists
3054497613e6: Layer already exists
d35dc7f4c79e: Layer already exists
73a376432bd2: Pushed
3f892ce166a8: Pushed
dabfe5b2ea81: Layer already exists
5e6a409f30b6: Layer already exists
6008dee27547: Pushed
4e60e3c97e6a: Pushed
20211009124332: digest: sha256:2f81eade5b657d80bb1f34c9933206fff45543a424d81dd97f52667de5b350d4 size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 09, 2021 12:45:35 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 09, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 09, 2021 12:45:36 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 09, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 09, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 09, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 09, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 09, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash e399f8c535506db5c907b1ed5dc6d661cc648e01c91ff19338c5963e13bca986> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-45n4xTVQbbXJB7HtXcbWYcxkjgHJH_GTOMWWPhO8qYY.pb
Oct 09, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 09, 2021 12:45:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc]
Oct 09, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 09, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 09, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 09, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 09, 2021 12:45:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e]
Oct 09, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 09, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 09, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 09, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 09, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 09, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 09, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 09, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 09, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 09, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 09, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 09, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 09, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-09_05_45_41-5228465986545050931?project=apache-beam-testing
Oct 09, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-09_05_45_41-5228465986545050931
Oct 09, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-09_05_45_41-5228465986545050931
Oct 09, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-09T12:45:47.493Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-g77q. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 09, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:51.663Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:52.405Z: Expanding SplittableParDo operations into optimizable parts.
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:52.456Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:52.520Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:52.587Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:52.627Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:52.689Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:52.808Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:52.836Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:52.869Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:52.890Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:52.922Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:52.958Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:52.984Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:53.012Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:53.040Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:53.072Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:53.119Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:53.152Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:53.185Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:53.208Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:53.243Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:53.272Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:53.299Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:53.331Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:53.356Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:53.382Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:53.405Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:53.462Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 09, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:53.498Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 09, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:45:53.828Z: Starting 5 workers in us-central1-a...
Oct 09, 2021 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:46:14.986Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 09, 2021 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:46:34.769Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 09, 2021 12:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:47:32.218Z: Workers have started successfully.
Oct 09, 2021 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T12:47:32.253Z: Workers have started successfully.
Oct 09, 2021 1:01:05 PM org.apache.beam.sdk.metrics.MetricsEnvironment getCurrentContainer
WARNING: Reporting metrics are not supported in the current execution environment.
Oct 09, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T16:00:25.436Z: Cancel request is committed for workflow job: 2021-10-09_05_45_41-5228465986545050931.
Oct 09, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T16:00:25.470Z: Cleaning up.
Oct 09, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T16:00:25.541Z: Stopping worker pool...
Oct 09, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T16:00:25.607Z: Stopping worker pool...
Oct 09, 2021 4:02:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T16:02:46.299Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 09, 2021 4:02:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-09T16:02:46.347Z: Worker pool stopped.
Oct 09, 2021 4:02:52 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-09_05_45_41-5228465986545050931 finished with status CANCELLED.
Load test results for test (ID): 6d47d85d-014b-4bb2-a548-347584ac4163 and timestamp: 2021-10-09T12:45:36.194000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11519.927
dataflow_v2_java11_total_bytes_count             2.47245451E10
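As an aside, the results block above has a fixed two-column label/value layout. A minimal Python sketch (illustrative only; this helper is not part of the Beam test utilities, and the parsing rules are an assumption based on the layout shown here) shows one way such a block could be turned into a metrics map:

```python
# Parse a "Load test results" metrics block of the form emitted above.
# Illustrative helper only; not part of the Beam SDK or its test utilities.

def parse_load_test_metrics(block: str) -> dict:
    """Return {metric_name: float_value} for each two-column data line."""
    metrics = {}
    for line in block.splitlines():
        parts = line.split()
        if len(parts) != 2:
            continue
        try:
            metrics[parts[0]] = float(parts[1])
        except ValueError:
            # Skips the "Metric: / Value:" header row, whose second
            # field is not numeric.
            continue
    return metrics


SAMPLE = """\
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11519.927
dataflow_v2_java11_total_bytes_count             2.47245451E10
"""

metrics = parse_load_test_metrics(SAMPLE)
```

Scientific-notation values such as `2.47245451E10` parse directly with `float()`, so both data rows above land in the map while the header row is dropped.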
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211009124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2f81eade5b657d80bb1f34c9933206fff45543a424d81dd97f52667de5b350d4
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211009124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2f81eade5b657d80bb1f34c9933206fff45543a424d81dd97f52667de5b350d4]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211009124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2f81eade5b657d80bb1f34c9933206fff45543a424d81dd97f52667de5b350d4])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2f81eade5b657d80bb1f34c9933206fff45543a424d81dd97f52667de5b350d4
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2f81eade5b657d80bb1f34c9933206fff45543a424d81dd97f52667de5b350d4
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2f81eade5b657d80bb1f34c9933206fff45543a424d81dd97f52667de5b350d4].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 38s
101 actionable tasks: 72 executed, 27 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/vakcw2lsbpcw4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #113

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/113/display/redirect?page=changes>

Changes:

[Udi Meiri] Release script fixes

[noreply] [BEAM-13015] Remove the overhead of SpecMonitoringInfoValidator

[noreply] Minor: Replace generic external.py links in multi-language documentation

[noreply] Revert "[BEAM-12993] Update to Debezium 1.7.0.Final (#15636)"

[kawaigin] Updated screendiff integration test golden screenshots.

[noreply] [BEAM-12769] Few fixes related to Java Class Lookup based cross-language


------------------------------------------
[...truncated 49.01 KB...]
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
cfb0e330e2cd: Preparing
350a52240e0a: Preparing
00e9ae2e2d18: Preparing
df97d3495d61: Preparing
62a494936b66: Preparing
89ac3b3d8563: Preparing
9aa27e885a7a: Preparing
99924550047b: Preparing
b40a81c38a7c: Preparing
337ad35acb76: Preparing
e885f8d0ca73: Preparing
0269687aeff1: Preparing
874ad65f91ea: Preparing
0fc2498b65e5: Preparing
d08e6b97bf21: Preparing
89ac3b3d8563: Waiting
3054497613e6: Preparing
d35dc7f4c79e: Preparing
dabfe5b2ea81: Preparing
5e6a409f30b6: Preparing
3054497613e6: Waiting
e885f8d0ca73: Waiting
99924550047b: Waiting
dabfe5b2ea81: Waiting
5e6a409f30b6: Waiting
0269687aeff1: Waiting
9aa27e885a7a: Waiting
d08e6b97bf21: Waiting
874ad65f91ea: Waiting
d35dc7f4c79e: Waiting
350a52240e0a: Pushed
62a494936b66: Pushed
00e9ae2e2d18: Pushed
89ac3b3d8563: Pushed
cfb0e330e2cd: Pushed
df97d3495d61: Pushed
99924550047b: Pushed
b40a81c38a7c: Pushed
874ad65f91ea: Layer already exists
0fc2498b65e5: Layer already exists
d08e6b97bf21: Layer already exists
3054497613e6: Layer already exists
e885f8d0ca73: Pushed
d35dc7f4c79e: Layer already exists
0269687aeff1: Pushed
9aa27e885a7a: Pushed
5e6a409f30b6: Layer already exists
dabfe5b2ea81: Layer already exists
337ad35acb76: Pushed
20211008124332: digest: sha256:1447960b06deec03f1c40156ae68f611464231872252e290a78a8ca8b5fbc7c4 size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 08, 2021 12:45:24 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 08, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 08, 2021 12:45:25 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 08, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 08, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 08, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 08, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 08, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 21965ab3c94e3331365372cf25a920dffaf948608673c1d0b0ecb2f42428204e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-IZZas8lOMzE2U3LPJakg3_r5SGCGc8HQsOyy9CQoIE4.pb
Oct 08, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 08, 2021 12:45:29 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc]
Oct 08, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 08, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 08, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 08, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 08, 2021 12:45:29 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e]
Oct 08, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 08, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 08, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 08, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 08, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 08, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 08, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 08, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 08, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 08, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 08, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 08, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 08, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-08_05_45_29-12309857362774759606?project=apache-beam-testing
Oct 08, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-08_05_45_29-12309857362774759606
Oct 08, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-08_05_45_29-12309857362774759606
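The suggested `gcloud` invocation above can also be wrapped in a small shell helper (hypothetical; not part of Beam or this Jenkins job, with the default project and region simply mirroring the values in this log) that supports a dry-run mode for inspecting the command before running it:

```shell
#!/bin/sh
# Compose the 'gcloud dataflow jobs cancel' invocation shown in the log.
# With DRY_RUN=1 the command is only printed, so it can be inspected
# without gcloud installed or a live job.

cancel_dataflow_job() {
    job_id="$1"
    project="${2:-apache-beam-testing}"
    region="${3:-us-central1}"
    cmd="gcloud dataflow jobs --project=${project} cancel --region=${region} ${job_id}"
    if [ "${DRY_RUN:-0}" = "1" ]; then
        echo "$cmd"
    else
        $cmd
    fi
}

DRY_RUN=1 cancel_dataflow_job 2021-10-08_05_45_29-12309857362774759606
```

The flag order matches the log's own snippet (`--project` before `cancel`, `--region` after), which `gcloud` accepts since the flags apply to the `dataflow jobs` command group.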
Oct 08, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-08T12:45:36.429Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-krt3. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 08, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:40.659Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:41.664Z: Expanding SplittableParDo operations into optimizable parts.
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:41.695Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:41.761Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:41.820Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:41.849Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:41.913Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.012Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.048Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.082Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.115Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.151Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.185Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.227Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.255Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.289Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.318Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.345Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.376Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.422Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.455Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.491Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.520Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.547Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.576Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.603Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.633Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.664Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.688Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:42.713Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 08, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:45:43.051Z: Starting 5 workers in us-central1-a...
Oct 08, 2021 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:46:13.573Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 08, 2021 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:46:23.365Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 08, 2021 12:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:47:25.664Z: Workers have started successfully.
Oct 08, 2021 12:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T12:47:25.698Z: Workers have started successfully.
Oct 08, 2021 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T16:00:38.056Z: Cancel request is committed for workflow job: 2021-10-08_05_45_29-12309857362774759606.
Oct 08, 2021 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T16:00:38.283Z: Cleaning up.
Oct 08, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T16:00:38.366Z: Stopping worker pool...
Oct 08, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T16:00:38.439Z: Stopping worker pool...
Oct 08, 2021 4:03:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T16:03:02.106Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 08, 2021 4:03:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-08T16:03:02.149Z: Worker pool stopped.
Oct 08, 2021 4:03:09 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-08_05_45_29-12309857362774759606 finished with status CANCELLED.
Oct 08, 2021 4:03:09 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace co_gbk
Load test results for test (ID): 1f1e0994-1f49-4adc-9278-e1a2b31c5e51 and timestamp: 2021-10-08T12:45:24.872000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                       0.0
dataflow_v2_java11_total_bytes_count                      -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211008124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1447960b06deec03f1c40156ae68f611464231872252e290a78a8ca8b5fbc7c4
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211008124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1447960b06deec03f1c40156ae68f611464231872252e290a78a8ca8b5fbc7c4]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211008124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1447960b06deec03f1c40156ae68f611464231872252e290a78a8ca8b5fbc7c4])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1447960b06deec03f1c40156ae68f611464231872252e290a78a8ca8b5fbc7c4
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1447960b06deec03f1c40156ae68f611464231872252e290a78a8ca8b5fbc7c4
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1447960b06deec03f1c40156ae68f611464231872252e290a78a8ca8b5fbc7c4].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 54s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/x42dkfqynqtlo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #112

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/112/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Restore "Default to Runner v2 for Python Streaming jobs. (#15140)"

[stefan.istrate] Fix "too many pings" errors.

[stefan.istrate] Increase keepalive timeout to 5 minutes.

[david.prieto] [BEAM-12950] Not delete orphaned files to avoid missing events

[david.prieto] [BEAM-12950] Add Bug fix description to CHANGES.md

[david.prieto] [BEAM-12950] fix linter issues

[Robert Bradshaw] Dead letter option.

[Robert Bradshaw] Guard setup.py logic with __main__ condition.

[stefan.istrate] Fix yapf complaints.

[Robert Bradshaw] Avoid incompatible setting.

[aydar.zaynutdinov] [BEAM-12969] [Playground]

[david.prieto] [BEAN-12950] Skip unit test

[noreply] [BEAM-12909][BEAM-12849]  Add support for running spark3 nexmark queries

[Robert Bradshaw] Add the ability to use subprocesses with the dead letter queue.

[Robert Bradshaw] Support multi-output DoFns.

[noreply] Merge pull request #15510 from [BEAM-12883] Add coder for

[Robert Bradshaw] multi-output fix

[Robert Bradshaw] Add thresholding to dead letter pattern.

[Robert Bradshaw] treshold test fixes

[Robert Bradshaw] Better naming, documentation.

[noreply] [BEAM-12482] Ensure that we ignore schema update options when loading

[Kyle Weaver] Moving to 2.35.0-SNAPSHOT on master branch.

[Kyle Weaver] Add 2.35.0 section to changelog.

[noreply] Fix email links in the contact page


------------------------------------------
[...truncated 48.66 KB...]
160c9ee67560: Preparing
0e3d0b3abb6a: Preparing
7b536ea6ab11: Preparing
420dd72927f0: Preparing
c3f410888c3c: Preparing
1b9057003931: Preparing
c73056b16765: Preparing
874ad65f91ea: Preparing
0fc2498b65e5: Preparing
d08e6b97bf21: Preparing
3054497613e6: Preparing
0e3d0b3abb6a: Waiting
d35dc7f4c79e: Preparing
dabfe5b2ea81: Preparing
5e6a409f30b6: Preparing
7b536ea6ab11: Waiting
160c9ee67560: Waiting
3054497613e6: Waiting
420dd72927f0: Waiting
1b9057003931: Waiting
d35dc7f4c79e: Waiting
dabfe5b2ea81: Waiting
0fc2498b65e5: Waiting
5e6a409f30b6: Waiting
c73056b16765: Waiting
874ad65f91ea: Waiting
d08e6b97bf21: Waiting
e35b5129678d: Pushed
6e384c2b1809: Pushed
6ecbd3a9033f: Pushed
160c9ee67560: Pushed
f48d33aaf847: Pushed
0a9c7e99229b: Pushed
7b536ea6ab11: Pushed
420dd72927f0: Pushed
c73056b16765: Pushed
874ad65f91ea: Layer already exists
1b9057003931: Pushed
d08e6b97bf21: Layer already exists
0fc2498b65e5: Layer already exists
3054497613e6: Layer already exists
0e3d0b3abb6a: Pushed
d35dc7f4c79e: Layer already exists
dabfe5b2ea81: Layer already exists
5e6a409f30b6: Layer already exists
c3f410888c3c: Pushed
20211007124337: digest: sha256:11c0503cf08d39e4b66d2656f17cc15473fde8fe87c87f5344ef98c37c52b182 size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 07, 2021 12:45:51 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 07, 2021 12:45:51 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 07, 2021 12:45:52 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 07, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 07, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 07, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 07, 2021 12:45:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 07, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash eb1d8cd27ace7e0e65013f0eced8e52f70fbf23f069c491660490d7d5b97df53> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-6x2M0nrOfg5lAT8OztjlL3D78j8GnEkWYEkNfVuX31M.pb
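The staged file name in the line above is derived from the upload's SHA-256 digest: the hex hash, encoded as unpadded URL-safe base64, becomes the `pipeline-….pb` name. This can be checked directly against the two values in the log:

```python
import base64

# The hex digest and staged filename both appear in the upload log line above;
# the filename is the unpadded URL-safe base64 encoding of the digest bytes.
hex_digest = "eb1d8cd27ace7e0e65013f0eced8e52f70fbf23f069c491660490d7d5b97df53"

name = base64.urlsafe_b64encode(bytes.fromhex(hex_digest)).rstrip(b"=").decode()
print(f"pipeline-{name}.pb")
# pipeline-6x2M0nrOfg5lAT8OztjlL3D78j8GnEkWYEkNfVuX31M.pb
```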
Oct 07, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 07, 2021 12:45:56 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc]
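The split above partitions the synthetic source's configured load into 20 bundles before execution. A toy sketch of that kind of even partitioning (my own illustration; the record count is made up, and Beam's actual splitting logic differs):

```python
def split_into_bundles(total_records: int, desired_bundles: int) -> list[range]:
    """Partition [0, total_records) into contiguous bundles whose sizes
    differ by at most one record."""
    bundles = []
    start = 0
    for i in range(desired_bundles):
        # Spread the remainder over the first (total % desired) bundles.
        size = total_records // desired_bundles + (1 if i < total_records % desired_bundles else 0)
        bundles.append(range(start, start + size))
        start += size
    return bundles

bundles = split_into_bundles(2_000_000, 20)
print(len(bundles), bundles[0])  # 20 range(0, 100000)
```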
Oct 07, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 07, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 07, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 07, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 07, 2021 12:45:56 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e]
Oct 07, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 07, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 07, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 07, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 07, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 07, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 07, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 07, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 07, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 07, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 07, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 07, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Oct 07, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-07_05_45_57-9477186185672990765?project=apache-beam-testing
Oct 07, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-07_05_45_57-9477186185672990765
Oct 07, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-07_05_45_57-9477186185672990765
Oct 07, 2021 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-07T12:46:04.506Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-adcz. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
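The warning above shows Dataflow rewriting the job name into a valid Cloud Label (lowercase letters, digits, and hyphens, at most 63 characters); judging from the modified name in the log, invalid characters appear to be replaced with `0`. A hedged approximation of that cleanup (the input string is my guess at the original job name, and this is not Dataflow's exact code):

```python
import re

def to_cloud_label(name: str, max_len: int = 63) -> str:
    """Approximate the label cleanup hinted at in the warning above:
    lowercase the name and replace characters outside [a-z0-9-] with '0'."""
    cleaned = re.sub(r"[^a-z0-9-]", "0", name.lower())
    return cleaned[:max_len]

# Hypothetical original job name reconstructed from the modified label.
print(to_cloud_label("load tests Java11 Dataflow V2 streaming CoGBK 1"))
# load0tests0java110dataflow0v20streaming0cogbk01
```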
Oct 07, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:10.714Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:11.476Z: Expanding SplittableParDo operations into optimizable parts.
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:11.524Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:11.571Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:11.620Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:11.639Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:11.701Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:11.803Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:11.836Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:11.875Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:11.904Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:11.938Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:11.970Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:12.005Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:12.030Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:12.049Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:12.078Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:12.107Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:12.129Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:12.164Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:12.189Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:12.241Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:12.277Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:12.314Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:12.374Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:12.596Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:12.671Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:12.717Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:12.754Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:12.793Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
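The "Fusing consumer … into …" lines above record the optimizer merging adjacent ParDo/Read/Write steps into single execution stages, so elements flow from producer to consumer without being materialized in between. A toy illustration of producer-consumer fusion as function composition (not Dataflow's optimizer; the stage functions are made-up stand-ins):

```python
from functools import reduce

def fuse(*stages):
    """Fuse a chain of per-element functions into one stage: each element
    passes through every fused step in a single call."""
    return lambda x: reduce(lambda acc, f: f(acc), stages, x)

# Toy stages mirroring the shape of the fused chain in the log above.
strip_ids = lambda kv: kv[1]            # stand-in for ParDo(StripIds)
monitor = lambda v: v                   # stand-in for the TimeMonitor ParDo
window_assign = lambda v: ("window-0", v)  # stand-in for Window.Assign

fused_stage = fuse(strip_ids, monitor, window_assign)
print(fused_stage((42, "element")))  # ('window-0', 'element')
```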
Oct 07, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:13.175Z: Starting 5 workers in us-central1-a...
Oct 07, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:17.671Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 07, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:54.095Z: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 07, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:46:54.131Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
Oct 07, 2021 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:47:04.550Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 07, 2021 12:47:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:47:57.232Z: Workers have started successfully.
Oct 07, 2021 12:47:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T12:47:57.267Z: Workers have started successfully.
Oct 07, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T16:00:35.338Z: Cancel request is committed for workflow job: 2021-10-07_05_45_57-9477186185672990765.
Oct 07, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T16:00:35.411Z: Cleaning up.
Oct 07, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T16:00:35.503Z: Stopping worker pool...
Oct 07, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T16:00:35.559Z: Stopping worker pool...
Oct 07, 2021 4:02:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T16:02:56.441Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 07, 2021 4:02:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-07T16:02:56.476Z: Worker pool stopped.
Oct 07, 2021 4:03:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-07_05_45_57-9477186185672990765 finished with status CANCELLED.
Oct 07, 2021 4:03:02 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace co_gbk
Load test results for test (ID): 91d881a9-db67-46b6-87eb-f7bc4b2362f4 and timestamp: 2021-10-07T12:45:51.937000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                       0.0
dataflow_v2_java11_total_bytes_count                      -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211007124337
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:11c0503cf08d39e4b66d2656f17cc15473fde8fe87c87f5344ef98c37c52b182
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211007124337]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:11c0503cf08d39e4b66d2656f17cc15473fde8fe87c87f5344ef98c37c52b182]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211007124337] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:11c0503cf08d39e4b66d2656f17cc15473fde8fe87c87f5344ef98c37c52b182])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:11c0503cf08d39e4b66d2656f17cc15473fde8fe87c87f5344ef98c37c52b182
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:11c0503cf08d39e4b66d2656f17cc15473fde8fe87c87f5344ef98c37c52b182
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:11c0503cf08d39e4b66d2656f17cc15473fde8fe87c87f5344ef98c37c52b182].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 47s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/xqmtc2upalt3o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #111

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/111/display/redirect?page=changes>

Changes:

[noreply] Only stage the single (fat) jar when auto-starting expansion service.

[noreply] Disable samza counters (#15659)

[noreply] Merge pull request #15614 from [BEAM-12953] [Playground] Create protobuf

[noreply] [BEAM-11831] Parially Revert "[BEAM-11805] Replace user-agent for

[kawaigin] [BEAM-10708] Enable submit beam_sql built jobs to Dataflow

[noreply] Merge pull request #15602 from [BEAM-10917] Add support for BigQuery


------------------------------------------
[...truncated 50.07 KB...]
71f5d4a7c0ce: Waiting
5e81acc91598: Pushed
094dbb8ccf85: Pushed
4c21d10adb6c: Pushed
cf18ee068f1f: Pushed
714381da582b: Pushed
7eb1bf43d45c: Pushed
bbe3d4d07eac: Pushed
61152048081e: Pushed
874ad65f91ea: Layer already exists
0fc2498b65e5: Layer already exists
d08e6b97bf21: Layer already exists
3054497613e6: Layer already exists
f9a8961724da: Pushed
d35dc7f4c79e: Layer already exists
8bd8b261dd95: Pushed
dabfe5b2ea81: Layer already exists
5e6a409f30b6: Layer already exists
71f5d4a7c0ce: Pushed
0f672dd9254e: Pushed
20211006124334: digest: sha256:e628016bfaefed79c9cff4756a17dbfc5c8f0a02b812613169cf37166ac31d39 size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 06, 2021 12:45:43 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 06, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 06, 2021 12:45:44 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 06, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 06, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 06, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 06, 2021 12:45:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 06, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash ddbfa83e790ba31b88c31ac68c22385d9dcdf84d1287bca449e8067d47787b60> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-3b-oPnkLoxuIwxrGjCI4XZ3N-E0Sh7ykSegGfUd4e2A.pb
Oct 06, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 06, 2021 12:45:49 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc]
Oct 06, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 06, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 06, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 06, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 06, 2021 12:45:49 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e]
Oct 06, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 06, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 06, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 06, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 06, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 06, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 06, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 06, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 06, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 06, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 06, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 06, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Oct 06, 2021 12:45:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-06_05_45_49-7294463732958114634?project=apache-beam-testing
Oct 06, 2021 12:45:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-06_05_45_49-7294463732958114634
Oct 06, 2021 12:45:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-06_05_45_49-7294463732958114634
Oct 06, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-06T12:45:56.672Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-556h. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 06, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:01.350Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 06, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.010Z: Expanding SplittableParDo operations into optimizable parts.
Oct 06, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.055Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.124Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.205Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.223Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.278Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.384Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.415Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.446Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.478Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.504Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.530Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.559Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.596Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.628Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.661Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.690Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.716Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.745Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.775Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.807Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.858Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.888Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.912Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.940Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:02.978Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:03.004Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:03.055Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:03.082Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 06, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:03.428Z: Starting 5 workers in us-central1-a...
Oct 06, 2021 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:20.669Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 06, 2021 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:47.561Z: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 06, 2021 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:47.597Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
Oct 06, 2021 12:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:46:57.987Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 06, 2021 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:47:39.882Z: Workers have started successfully.
Oct 06, 2021 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T12:47:39.906Z: Workers have started successfully.
Oct 06, 2021 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T16:00:26.283Z: Cancel request is committed for workflow job: 2021-10-06_05_45_49-7294463732958114634.
Oct 06, 2021 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T16:00:26.344Z: Cleaning up.
Oct 06, 2021 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T16:00:26.423Z: Stopping worker pool...
Oct 06, 2021 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T16:00:26.475Z: Stopping worker pool...
Oct 06, 2021 4:03:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T16:02:58.434Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 06, 2021 4:03:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-06T16:02:58.478Z: Worker pool stopped.
Oct 06, 2021 4:03:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-06_05_45_49-7294463732958114634 finished with status CANCELLED.
Load test results for test (ID): 3318cbb5-e9bd-4b3d-91a2-e9d4b550ae4f and timestamp: 2021-10-06T12:45:43.844000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11527.271
dataflow_v2_java11_total_bytes_count             2.63679296E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
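
[Editor's note: the exception above is the load test's terminal-state check rejecting the CANCELLED outcome. A minimal, hypothetical Java sketch of that behavior — the class, enum, and method names here are illustrative only, not Beam's actual `JobFailure` API:]

```java
// Sketch: a load test that ends in any terminal state other than DONE
// is treated as a failure and aborts the JVM with a RuntimeException,
// which is what turns the cancelled Dataflow job into a build failure.
public class JobFailureSketch {
    enum State { DONE, CANCELLED, FAILED }

    // Mirrors the observed behavior of JobFailure.handleFailure: throw
    // unless the job finished successfully.
    static void handleFailure(State terminalState) {
        if (terminalState != State.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(State.CANCELLED);
        } catch (RuntimeException e) {
            // Matches the log line: Invalid job state: CANCELLED.
            System.out.println(e.getMessage());
        }
    }
}
```

[This also explains why the build fails even though the job was cancelled deliberately after its run window: cancellation is still a non-DONE terminal state.]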

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211006124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e628016bfaefed79c9cff4756a17dbfc5c8f0a02b812613169cf37166ac31d39
Deleted: sha256:7bbf3285144113335b0065fc3ef1ec7edf73bd6f80ee93017ea17caf970dd188
Deleted: sha256:76505d323d88ad58093e61d857381448040b965316c6723d9b3325588dbe3218
Deleted: sha256:afa140d8cc6b486910b2888e64f7d79203690319b57a933122c09e25caa29810
Deleted: sha256:5303ccee4b211750d4697609d783a189cf29570274f23ae43c9f8123def30f23
Deleted: sha256:171ffe0152ee103c3c35c63fc180f991d36cd44af9b64e031673b9d44a405a30
Deleted: sha256:60b43119adb48034e4add73a695523c3dbf2f27651c0687a02fa023ab6f3f38b
Deleted: sha256:cde541cd0dc247fe4fb433fa29ac641d13b8fd4efd4fdbc8fff105d8d9d957ec
Deleted: sha256:11414c1abb8f3de6e5c0062908897c982f699c208650f36795f28010afc4f51f
Deleted: sha256:67efe9104c53640dda844ab8ec04c55b398bce4b981df267161e9eeedad2089b
Deleted: sha256:efc26505aecb7a8c53f2e28018dea9c5566ba259c25b4d1b032e3251a495d909
Deleted: sha256:ae3fbe8b708a452d5f2b22d4e98d89ab3751973c8e2359102864fad0555eaf0b
Deleted: sha256:b66da3f19303aa9c7803a4cb004f5fa30028828b68c097540639d69944c8aa73
Deleted: sha256:2ca58c3ef0f2f4b600c12e00a425813a954aaa1cfaf0e7381f93c356794b0bbc
Deleted: sha256:5496f82e318bd89065476e2f26399a910b54a0f573271937c037d3eb5c5c7e8c
Deleted: sha256:da4fe823d6811761a2df825e4703d6d984e7ac400520156eba4bfe11014df8c2
Deleted: sha256:aff94b42c584140951d41f832c3265adc2ecdce2d1cdc74de0483c774d986dda
Deleted: sha256:50c847249b22f605c39871352cf8a51f044f9c56a79453580c93c1ef3d2eea10
Deleted: sha256:85cc57f55fe877d4cf64226494d3c4931c0674f31d2e8b2f225b6fe12ab468aa
Deleted: sha256:1c12885fdc712d6f6fb8beab01622935019acc90f15c9446dd8089d957d5fc84
Deleted: sha256:6a3e88143635807e52ef3f715941e0da9dd67915c48a50a8ee01f8f314d50669
Deleted: sha256:7ddea6e09e44b81f97cd7d2a55f49cfd4a36eed2049f1ea6c0261d54dccea484
Deleted: sha256:373e96462f2ff4cd979f30ae7c36cce65ae749877989edb8a5c8c6fe95e05768
Deleted: sha256:acfcdce21a67e9a3aaa9d49a40d33df17df74a78f5cdddc85c607fc6c274f5bd
Deleted: sha256:b7a76f6eb431d41b8e9620a4dc49a29367e2545c662d54b52b159d784c9cbfea
Deleted: sha256:05f764117388c05707eb4ec41f86e86ec92607b8e4ce23fadcb9407d1c04bcc9
Deleted: sha256:d06b76b5a9511376df0350fa36e8d839c800b4b6aba4db0cba427609dfe35ef3
Deleted: sha256:be578b9fd9685c97b096078673a86ef21301d373d91eacaf7e7307e79e963eae
Deleted: sha256:9d27526a3324ec0604f28abca0494c642329938e037fec88dd75b42c05cda26b
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211006124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e628016bfaefed79c9cff4756a17dbfc5c8f0a02b812613169cf37166ac31d39]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211006124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e628016bfaefed79c9cff4756a17dbfc5c8f0a02b812613169cf37166ac31d39])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e628016bfaefed79c9cff4756a17dbfc5c8f0a02b812613169cf37166ac31d39
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e628016bfaefed79c9cff4756a17dbfc5c8f0a02b812613169cf37166ac31d39
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e628016bfaefed79c9cff4756a17dbfc5c8f0a02b812613169cf37166ac31d39].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 47s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/lmrharnyeqwl6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #110

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/110/display/redirect?page=changes>

Changes:

[danthev] Switch hintNumWorkers to ValueProvider, switch firstInstant to side

[noreply] [BEAM-11516] Upgrade to pylint 2.11.1, fix warnings (#15612)

[noreply] [BEAM-12979, BEAM-11097] Change cache to store ReStreams, clean up to…

[noreply] [BEAM-3304, BEAM-12513] Trigger changes and Windowing. (#15644)

[noreply] Merge pull request #15618 from [BEAM-12968] [Playground] Create README

[noreply] Merge pull request #15626 from [BEAM-12963] [Playground] Create base

[noreply] Merge pull request #15566 from [BEAM-12925] Correct behavior retrieving

[noreply] [BEAM-12911] Update schema translation (Java, Python) to log more

[noreply] Update CHANGES.md for JdbcIO breaking change (#15651)

[noreply] [BEAM-12513] Add Go SDK metrics content to BPG. (#15650)

[noreply] [BEAM-12996] Improve Error Logging in ConfigBuilder (#15646)

[noreply] [BEAM-13000] Disable Reshuffle Translation in Samza Portable Mode


------------------------------------------
[...truncated 50.54 KB...]
d35dc7f4c79e: Layer already exists
dabfe5b2ea81: Layer already exists
5e6a409f30b6: Layer already exists
2a9ded77b8c8: Pushed
20211005124338: digest: sha256:ab9613b9dc166dd247d63962af738ee43bb2af319c8c3c6671ebebe920f88cd3 size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 05, 2021 12:45:52 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 05, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 05, 2021 12:45:53 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 05, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 05, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 05, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 05, 2021 12:45:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 05, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash c2c0a836df596cfe1b9287aa8bd003a2f851522c1c94950c98f05c765b52bc9d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-wsCoNt9ZbP4bkoeqi9ADovhRUiwclJUMmPBcdltSvJ0.pb
Oct 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 05, 2021 12:45:58 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc]
Oct 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 05, 2021 12:45:58 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a]
Oct 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Oct 05, 2021 12:46:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-05_05_45_58-6157338180392792576?project=apache-beam-testing
Oct 05, 2021 12:46:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-05_05_45_58-6157338180392792576
Oct 05, 2021 12:46:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-05_05_45_58-6157338180392792576
Oct 05, 2021 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-05T12:46:07.146Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-gspg. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 05, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:11.314Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.065Z: Expanding SplittableParDo operations into optimizable parts.
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.098Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.154Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.219Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.255Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.334Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.425Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.455Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.490Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.538Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.572Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.602Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.642Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.675Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.708Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.736Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.761Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.783Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.810Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.845Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.879Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.911Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.944Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 05, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:12.979Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 05, 2021 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:13.012Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 05, 2021 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:13.041Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 05, 2021 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:13.080Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 05, 2021 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:13.107Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 05, 2021 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:13.138Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 05, 2021 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:13.546Z: Starting 5 workers in us-central1-a...
Oct 05, 2021 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:34.885Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 05, 2021 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:48.957Z: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 05, 2021 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:48.989Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
Oct 05, 2021 12:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:46:59.336Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 05, 2021 12:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:47:52.464Z: Workers have started successfully.
Oct 05, 2021 12:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T12:47:52.490Z: Workers have started successfully.
Oct 05, 2021 2:53:15 PM org.apache.beam.sdk.metrics.MetricsEnvironment getCurrentContainer
WARNING: Reporting metrics are not supported in the current execution environment.
Oct 05, 2021 3:19:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-05T15:19:14.210Z: Staged package beam-vendor-grpc-1_36_0-0.2-FOu94RegK8zFq2op7b1CeJprMEONKvFs2JFoeFWpqFU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-FOu94RegK8zFq2op7b1CeJprMEONKvFs2JFoeFWpqFU.jar' is inaccessible.
Oct 05, 2021 3:19:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-05T15:19:16.481Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 05, 2021 3:22:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-05T15:22:16.970Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 05, 2021 3:25:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-05T15:25:14.242Z: Staged package beam-vendor-grpc-1_36_0-0.2-FOu94RegK8zFq2op7b1CeJprMEONKvFs2JFoeFWpqFU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-FOu94RegK8zFq2op7b1CeJprMEONKvFs2JFoeFWpqFU.jar' is inaccessible.
Oct 05, 2021 3:25:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-05T15:25:16.929Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 05, 2021 3:28:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-05T15:28:17.058Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 05, 2021 3:31:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-05T15:31:14.323Z: Staged package beam-vendor-grpc-1_36_0-0.2-FOu94RegK8zFq2op7b1CeJprMEONKvFs2JFoeFWpqFU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-FOu94RegK8zFq2op7b1CeJprMEONKvFs2JFoeFWpqFU.jar' is inaccessible.
Oct 05, 2021 3:31:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-05T15:31:17.127Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 05, 2021 3:34:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-05T15:34:17.065Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 05, 2021 3:37:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-05T15:37:14.371Z: Staged package beam-vendor-grpc-1_36_0-0.2-FOu94RegK8zFq2op7b1CeJprMEONKvFs2JFoeFWpqFU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-FOu94RegK8zFq2op7b1CeJprMEONKvFs2JFoeFWpqFU.jar' is inaccessible.
Oct 05, 2021 3:37:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-05T15:37:16.856Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 05, 2021 3:40:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-05T15:40:16.876Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 05, 2021 3:43:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-05T15:43:14.293Z: Staged package beam-vendor-grpc-1_36_0-0.2-FOu94RegK8zFq2op7b1CeJprMEONKvFs2JFoeFWpqFU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-FOu94RegK8zFq2op7b1CeJprMEONKvFs2JFoeFWpqFU.jar' is inaccessible.
Oct 05, 2021 3:43:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-05T15:43:16.866Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 05, 2021 3:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-05T15:46:17.045Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 05, 2021 3:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-05T15:49:14.824Z: Staged package beam-vendor-grpc-1_36_0-0.2-FOu94RegK8zFq2op7b1CeJprMEONKvFs2JFoeFWpqFU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-FOu94RegK8zFq2op7b1CeJprMEONKvFs2JFoeFWpqFU.jar' is inaccessible.
Oct 05, 2021 3:49:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-05T15:49:17.493Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 05, 2021 3:52:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-05T15:52:16.921Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 05, 2021 3:55:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-05T15:55:14.257Z: Staged package beam-vendor-grpc-1_36_0-0.2-FOu94RegK8zFq2op7b1CeJprMEONKvFs2JFoeFWpqFU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-FOu94RegK8zFq2op7b1CeJprMEONKvFs2JFoeFWpqFU.jar' is inaccessible.
Oct 05, 2021 3:55:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-05T15:55:16.777Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 05, 2021 3:58:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-05T15:58:16.868Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 05, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T16:00:30.725Z: Cancel request is committed for workflow job: 2021-10-05_05_45_58-6157338180392792576.
Oct 05, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T16:00:30.751Z: Cleaning up.
Oct 05, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T16:00:30.801Z: Stopping worker pool...
Oct 05, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T16:00:30.848Z: Stopping worker pool...
Oct 05, 2021 4:02:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T16:02:54.028Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 05, 2021 4:02:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-05T16:02:54.061Z: Worker pool stopped.
Oct 05, 2021 4:02:59 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-05_05_45_58-6157338180392792576 finished with status CANCELLED.
Load test results for test (ID): 3fe74c8b-ffc5-49ab-8bc8-d97a737df786 and timestamp: 2021-10-05T12:45:52.765000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11515.863
dataflow_v2_java11_total_bytes_count             2.16827679E10
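[Editor's note: the two metrics above imply an average end-to-end throughput for the run. A quick back-of-envelope check, computed purely from the reported values:]

```java
public class ThroughputCheck {
    // Average throughput = total bytes processed / runtime in seconds.
    static double bytesPerSecond(double totalBytes, double runtimeSec) {
        return totalBytes / runtimeSec;
    }

    public static void main(String[] args) {
        // Values reported by the load test above.
        double tp = bytesPerSecond(2.16827679e10, 11515.863);
        System.out.printf("average throughput: %.2f MB/s%n", tp / 1e6);
    }
}
```

This works out to roughly 1.9 MB/s averaged over the whole run, including the period after 15:19 when the job was stalled on the staging-access errors, so it understates the healthy-state throughput.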
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
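[Editor's note: the stack trace shows the load-test harness (JobFailure.handleFailure) treating the CANCELLED terminal state as a build failure, which is why this cancelled run fails the Jenkins job. A minimal sketch of that kind of check follows; it is a hypothetical reconstruction under the assumption that CANCELLED and FAILED are the non-success terminal states, not Beam's actual JobFailure implementation.]

```java
public class JobFailureCheck {
    // Terminal states the harness would treat as a failed run (assumed set).
    static boolean isFailure(String terminalState) {
        return terminalState.equals("CANCELLED") || terminalState.equals("FAILED");
    }

    // Throws with the same message shape seen in the log above.
    static void handleFailure(String terminalState) {
        if (isFailure(terminalState)) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        handleFailure("DONE"); // a successful run passes through silently
    }
}
```

The practical consequence: load tests that time out and are cancelled (as here, after the staged-package access errors stalled the workers) surface as RuntimeException rather than as a distinct "cancelled" outcome.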

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211005124338
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ab9613b9dc166dd247d63962af738ee43bb2af319c8c3c6671ebebe920f88cd3
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211005124338]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ab9613b9dc166dd247d63962af738ee43bb2af319c8c3c6671ebebe920f88cd3]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211005124338] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ab9613b9dc166dd247d63962af738ee43bb2af319c8c3c6671ebebe920f88cd3])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ab9613b9dc166dd247d63962af738ee43bb2af319c8c3c6671ebebe920f88cd3
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ab9613b9dc166dd247d63962af738ee43bb2af319c8c3c6671ebebe920f88cd3
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ab9613b9dc166dd247d63962af738ee43bb2af319c8c3c6671ebebe920f88cd3].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 41s
101 actionable tasks: 71 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/adw7revy7emsm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #109

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/109/display/redirect>

Changes:


------------------------------------------
[...truncated 49.58 KB...]
9f76f57b7c6b: Waiting
5e6a409f30b6: Waiting
8319f73d7f90: Waiting
5c54222c71d2: Waiting
3054497613e6: Waiting
dabfe5b2ea81: Waiting
d35dc7f4c79e: Waiting
0fc2498b65e5: Waiting
3a50f81907ae: Waiting
d08e6b97bf21: Waiting
874ad65f91ea: Waiting
6835ae92c92b: Pushed
b251633a88ca: Pushed
15cc6117fc3d: Pushed
73459cccf4dd: Pushed
ba9f799a23f0: Pushed
9f76f57b7c6b: Pushed
2f286d92c874: Pushed
5c54222c71d2: Pushed
874ad65f91ea: Layer already exists
0fc2498b65e5: Layer already exists
d08e6b97bf21: Layer already exists
3054497613e6: Layer already exists
d35dc7f4c79e: Layer already exists
dabfe5b2ea81: Layer already exists
5e6a409f30b6: Layer already exists
8319f73d7f90: Pushed
fc040c8215d7: Pushed
3a50f81907ae: Pushed
2b025f87242a: Pushed
20211004124351: digest: sha256:c9882c75be6c4d85295e8745a86d18134e0050f38a39bd621eab3a8f776d629d size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 04, 2021 12:46:29 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 04, 2021 12:46:29 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 04, 2021 12:46:30 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 04, 2021 12:46:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 04, 2021 12:46:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 04, 2021 12:46:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 2 seconds
Oct 04, 2021 12:46:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 04, 2021 12:46:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash adadd895c2b7f1f2147519d5f0b7a947cf150e113435eb0c116f8138417db950> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ra3YlcK38fIUdRnV8LepR88VDhE0NesMEW-BOEF9uVA.pb
Oct 04, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 04, 2021 12:46:38 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@aaa0f76, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@340a8894, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a8b9166, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4acc5dff, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@10c72a6f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70e94ecb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56cfe111, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e446d92, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57f9b467, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d5c2745, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44b29496, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1947596f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3078cac, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f6de586, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f2bd6d9]
Oct 04, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 04, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 04, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 04, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 04, 2021 12:46:38 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34b9eb03, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43fda8d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49d831c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53a7a60c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@28bdbe88, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a87026, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ef60710, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@600f5704, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2503ec73, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@606f81b5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e1fc42f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44b21f9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ee8130e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6296474f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4288d98e]
Oct 04, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 04, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 04, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 04, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 04, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 04, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 04, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 04, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 04, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 04, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 04, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 04, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Oct 04, 2021 12:46:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-04_05_46_38-593367212600458282?project=apache-beam-testing
Oct 04, 2021 12:46:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-04_05_46_38-593367212600458282
Oct 04, 2021 12:46:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-04_05_46_38-593367212600458282
Oct 04, 2021 12:46:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-04T12:46:45.865Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-orar. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 04, 2021 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:50.275Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 04, 2021 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.120Z: Expanding SplittableParDo operations into optimizable parts.
Oct 04, 2021 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.159Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.237Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.322Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.353Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.402Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.559Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.612Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.655Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.680Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.705Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.737Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.760Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.792Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.826Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.857Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.890Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.926Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.957Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:51.990Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:52.040Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:52.069Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:52.103Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:52.132Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:52.166Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:52.200Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:52.235Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:52.268Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:52.298Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 04, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:46:52.636Z: Starting 5 workers in us-central1-a...
Oct 04, 2021 12:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:47:07.823Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 04, 2021 12:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:47:35.894Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 04, 2021 12:48:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:48:31.445Z: Workers have started successfully.
Oct 04, 2021 12:48:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T12:48:31.479Z: Workers have started successfully.
Oct 04, 2021 1:31:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-04T13:31:53.734Z: Staged package checker-qual-3.10.0-pNyILKaqxJbTM4Hl5esGBMRUg7AEvD6sk2jxuxfLJRI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/checker-qual-3.10.0-pNyILKaqxJbTM4Hl5esGBMRUg7AEvD6sk2jxuxfLJRI.jar' is inaccessible.
Oct 04, 2021 1:31:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-04T13:31:54.666Z: Staged package google-http-client-1.39.2-arSWeGCSfwUd1_K1deS3qO8182fcLjQIBzqk3uMo4ic.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-1.39.2-arSWeGCSfwUd1_K1deS3qO8182fcLjQIBzqk3uMo4ic.jar' is inaccessible.
Oct 04, 2021 1:31:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-04T13:31:54.706Z: Staged package google-http-client-apache-v2-1.39.2-UtOSCfZav29XfhyrRmrbmfEtHNK6TpjYblAfddf4UM8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-apache-v2-1.39.2-UtOSCfZav29XfhyrRmrbmfEtHNK6TpjYblAfddf4UM8.jar' is inaccessible.
Oct 04, 2021 1:31:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-04T13:31:54.752Z: Staged package google-http-client-appengine-1.39.2-9jINwNsDWv9uqy4odIVg2nGKtGX7w0gbZT6j8nqlysk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-appengine-1.39.2-9jINwNsDWv9uqy4odIVg2nGKtGX7w0gbZT6j8nqlysk.jar' is inaccessible.
Oct 04, 2021 1:31:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-04T13:31:54.794Z: Staged package google-http-client-gson-1.39.2-z4GUT9Qf4NqoabkTvQqq7w5Nj-Z3ShlVQPRU7NPPULY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-gson-1.39.2-z4GUT9Qf4NqoabkTvQqq7w5Nj-Z3ShlVQPRU7NPPULY.jar' is inaccessible.
Oct 04, 2021 1:31:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-04T13:31:54.855Z: Staged package google-http-client-jackson2-1.39.2-dKSTp7ol4d-z06Td9xMwbVo3htsAo0idWQYdyM_zIbE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-jackson2-1.39.2-dKSTp7ol4d-z06Td9xMwbVo3htsAo0idWQYdyM_zIbE.jar' is inaccessible.
Oct 04, 2021 1:31:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-04T13:31:54.901Z: Staged package google-http-client-protobuf-1.39.2-48CRMH_IJ9gY8fZ3-5ANC0qF7rqbijaXD90xuQTMr5s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-http-client-protobuf-1.39.2-48CRMH_IJ9gY8fZ3-5ANC0qF7rqbijaXD90xuQTMr5s.jar' is inaccessible.
Oct 04, 2021 1:31:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-04T13:31:56.325Z: Staged package opencensus-contrib-grpc-util-0.28.0-uRaDRuavZZMwChvCfvdCVKofJAGYhZON2PuFK4d9VfA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/opencensus-contrib-grpc-util-0.28.0-uRaDRuavZZMwChvCfvdCVKofJAGYhZON2PuFK4d9VfA.jar' is inaccessible.
Oct 04, 2021 1:31:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-04T13:31:56.888Z: Staged package zstd-jni-1.4.5-2-lzQxwUtNCahuI7cYQRb8rC2FUB60p0MPfRhczhr0YFA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/zstd-jni-1.4.5-2-lzQxwUtNCahuI7cYQRb8rC2FUB60p0MPfRhczhr0YFA.jar' is inaccessible.
Oct 04, 2021 1:31:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-04T13:31:56.915Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 04, 2021 1:34:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-04T13:34:56.452Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
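[Editor's note] The SEVERE lines above all follow the same "Staged package ... is inaccessible" pattern. As a triage aid (hypothetical helper, not part of the Beam build or the log), the failing GCS objects can be pulled out of a captured log like this, for a follow-up check with `gsutil ls`:

```python
import re

# Matches Dataflow's staged-package access-check failure lines, e.g.
# "Staged package foo.jar at location 'gs://bucket/foo.jar' is inaccessible."
INACCESSIBLE = re.compile(
    r"Staged package (\S+) at location '(\S+)' is inaccessible"
)

def inaccessible_packages(log_text):
    """Return (jar-name, gcs-path) pairs for every 'is inaccessible' line."""
    return INACCESSIBLE.findall(log_text)

if __name__ == "__main__":
    sample = (
        "SEVERE: 2021-10-04T13:31:53.734Z: Staged package a.jar at location "
        "'gs://temp-storage-for-perf-tests/loadtests/staging/a.jar' "
        "is inaccessible."
    )
    for jar, path in inaccessible_packages(sample):
        print(jar, path)
```

Feeding the extracted paths to a storage-permission check would confirm whether the job's worker service account lost read access to the staging bucket, which is what the "access checks for temp location or staged files failed" warning suggests.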
Oct 04, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T16:00:25.031Z: Cancel request is committed for workflow job: 2021-10-04_05_46_38-593367212600458282.
Oct 04, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T16:00:25.101Z: Cleaning up.
Oct 04, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T16:00:25.190Z: Stopping worker pool...
Oct 04, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T16:00:25.265Z: Stopping worker pool...
Oct 04, 2021 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T16:02:55.598Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 04, 2021 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-04T16:02:55.640Z: Worker pool stopped.
Oct 04, 2021 4:03:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-04_05_46_38-593367212600458282 finished with status CANCELLED.
Load test results for test (ID): 27e6074e-5b81-4ace-ace8-dc66d233bb9a and timestamp: 2021-10-04T12:46:29.845000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11465.354
dataflow_v2_java11_total_bytes_count             2.61286546E10
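[Editor's note] The two raw metrics above are easier to sanity-check in familiar units (illustrative conversion only, not part of the test harness):

```python
# Reported load-test metrics from the log above.
runtime_sec = 11465.354       # dataflow_v2_java11_runtime_sec
total_bytes = 2.61286546e10   # dataflow_v2_java11_total_bytes_count

runtime_hours = runtime_sec / 3600
total_gib = total_bytes / 2**30
throughput_mib_s = (total_bytes / 2**20) / runtime_sec

print(f"runtime: {runtime_hours:.2f} h")        # ~3.18 h
print(f"data processed: {total_gib:.1f} GiB")   # ~24.3 GiB
print(f"throughput: {throughput_mib_s:.1f} MiB/s")
```

So the pipeline processed roughly 24 GiB over the ~3.2 hours it ran before the cancel request, consistent with the 4:00 PM cancellation of a job submitted at 12:46 PM.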
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211004124351
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c9882c75be6c4d85295e8745a86d18134e0050f38a39bd621eab3a8f776d629d
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211004124351]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c9882c75be6c4d85295e8745a86d18134e0050f38a39bd621eab3a8f776d629d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211004124351] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c9882c75be6c4d85295e8745a86d18134e0050f38a39bd621eab3a8f776d629d])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c9882c75be6c4d85295e8745a86d18134e0050f38a39bd621eab3a8f776d629d
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c9882c75be6c4d85295e8745a86d18134e0050f38a39bd621eab3a8f776d629d
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c9882c75be6c4d85295e8745a86d18134e0050f38a39bd621eab3a8f776d629d].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 39s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5iw7uoimzj6b2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #108

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/108/display/redirect>

Changes:


------------------------------------------
[...truncated 53.94 KB...]
INFO: Adding Read input/StripIds as step s2
Oct 03, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 03, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 03, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 03, 2021 12:45:42 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34b9eb03, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43fda8d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49d831c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53a7a60c]
Oct 03, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 03, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 03, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 03, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 03, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 03, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 03, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 03, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 03, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 03, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 03, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 03, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Oct 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-03_05_45_42-3666392489918784440?project=apache-beam-testing
Oct 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-03_05_45_42-3666392489918784440
Oct 03, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-03_05_45_42-3666392489918784440
Oct 03, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-03T12:45:49.889Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-3613. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:54.375Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.061Z: Expanding SplittableParDo operations into optimizable parts.
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.093Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.158Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.211Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.234Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.303Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.413Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.442Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.472Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.508Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.543Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.575Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.624Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.656Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.694Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.717Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.744Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.778Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 03, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.813Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 03, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.849Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 03, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.886Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 03, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.920Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 03, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.953Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 03, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:55.988Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 03, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:56.032Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 03, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:56.085Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 03, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:56.119Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 03, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:56.157Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 03, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:56.191Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 03, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:45:56.593Z: Starting 5 workers in us-central1-a...
Oct 03, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:46:08.850Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 03, 2021 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:46:28.917Z: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 03, 2021 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:46:28.949Z: Resized worker pool to 1, though goal was 5. This could be a quota issue.
Oct 03, 2021 12:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:46:39.276Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 03, 2021 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:47:28.039Z: Workers have started successfully.
Oct 03, 2021 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T12:47:28.072Z: Workers have started successfully.
Oct 03, 2021 1:42:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-03T13:42:57.719Z: Staged package common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar' is inaccessible.
Oct 03, 2021 1:42:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-03T13:42:57.798Z: Staged package common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar' is inaccessible.
Oct 03, 2021 1:43:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-03T13:42:59.813Z: Staged package kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-avro-serializer-5.3.2-7FAoV2nngjZLbCXD3IPBOus_R_2KrM_zffAaNlVOls4.jar' is inaccessible.
Oct 03, 2021 1:43:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-03T13:42:59.871Z: Staged package kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-schema-registry-client-5.3.2-0mGQmXEF1E2hwRsR-O-mddgC_DwcNIU4Ba9SQcw2hKY.jar' is inaccessible.
Oct 03, 2021 1:43:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-03T13:43:00.666Z: Staged package spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-core-4.3.18.RELEASE-qd-hKocv03STfOrJiihBIfzwVyBqE8q9UnPLPqutu2w.jar' is inaccessible.
Oct 03, 2021 1:43:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-03T13:43:00.708Z: Staged package spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/spring-expression-4.3.18.RELEASE-aB3PS8pe5rtvJwR2BF3WrB6nO_rdEY2KpkFz7bel1_A.jar' is inaccessible.
Oct 03, 2021 1:43:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-10-03T13:43:00.807Z: Staged package zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar' is inaccessible.
Oct 03, 2021 1:43:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-03T13:43:00.854Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Oct 03, 2021 1:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-03T13:46:00.372Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
[... the same seven staged-package "is inaccessible" SEVERE errors and "access checks ... failed" WARNING lines repeated in further cycles at 13:48, 13:52, 13:54, 13:58, 14:00, and 14:04 UTC, with an identical package list each time ...]
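The repeated "is inaccessible" errors mean the Dataflow service could not read the staged jars under gs://temp-storage-for-perf-tests/loadtests/staging/. A minimal troubleshooting sketch (assumptions: `gsutil` is installed locally and you have access to the apache-beam-testing project) that generates one `gsutil stat` command per object reported in the log; it only builds the commands, it does not run them:

```python
# Build gsutil commands to check each staged jar that the Dataflow
# service reported as inaccessible. Running the printed commands (plus
# `gsutil iam get` on the bucket) shows whether each object exists and
# whether the job's service account can read it.
STAGING = "gs://temp-storage-for-perf-tests/loadtests/staging"

# Jar names copied from the SEVERE log lines above; add the remaining
# names from the log as needed.
JARS = [
    "common-config-5.3.2-B_HQvj7POWJFHkM1OEwdiS2Y6zStZi7OXoiw-BJN7Nc.jar",
    "common-utils-5.3.2-4Rm-xdfQw0Yp9EH6ym0CZyPloNIGMAJ4b5DO_bByg9c.jar",
    "zkclient-0.10-JumIuLuoOMck_YNQszHui1_8WcOpwHTfEVxMOmyEOHg.jar",
]

def stat_commands(staging, jars):
    """One `gsutil stat` per object; a non-zero exit means missing or unreadable."""
    return [f"gsutil stat {staging}/{jar}" for jar in jars]

for cmd in stat_commands(STAGING, JARS):
    print(cmd)
```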
Oct 03, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T16:00:29.798Z: Cancel request is committed for workflow job: 2021-10-03_05_45_42-3666392489918784440.
Oct 03, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T16:00:29.839Z: Cleaning up.
Oct 03, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T16:00:29.929Z: Stopping worker pool...
Oct 03, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T16:00:29.996Z: Stopping worker pool...
Oct 03, 2021 4:02:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T16:02:58.331Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 03, 2021 4:02:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-03T16:02:58.373Z: Worker pool stopped.
Oct 03, 2021 4:03:05 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-03_05_45_42-3666392489918784440 finished with status CANCELLED.
Load test results for test (ID): 49c562dd-2153-4d7c-937f-90954f10e085 and timestamp: 2021-10-03T12:45:37.177000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11531.379
dataflow_v2_java11_total_bytes_count             2.87697566E10
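The two reported metrics imply an average ingest rate for the run; a quick sanity check using only the values printed above:

```python
# Average throughput implied by the reported load-test metrics.
runtime_sec = 11531.379        # dataflow_v2_java11_runtime_sec
total_bytes = 2.87697566e10    # dataflow_v2_java11_total_bytes_count

bytes_per_sec = total_bytes / runtime_sec
print(f"{bytes_per_sec / 1e6:.2f} MB/s")  # roughly 2.5 MB/s averaged over the run
```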
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
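The RuntimeException above comes from the load-test harness treating the job's terminal state as a failure: the job was cancelled at 16:00 UTC, so its terminal state was CANCELLED rather than DONE. An illustrative sketch of that policy (Python for brevity; the actual check is the Java method org.apache.beam.sdk.loadtests.JobFailure.handleFailure shown in the stack trace, and its exact state handling may differ):

```python
# Illustrative only: turn a terminal Dataflow job state into a
# pass/fail outcome, the way the stack trace above suggests the
# load-test harness does.
def handle_terminal_state(state: str) -> None:
    # Any terminal state other than DONE (e.g. CANCELLED here, after
    # the job was cancelled mid-run) is reported as a test failure.
    if state != "DONE":
        raise RuntimeError(f"Invalid job state: {state}.")
```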

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211003124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:51d2c1149def1955f76024ebb3750826a7513251b5b43c88e75e59ddb9c580dd
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211003124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:51d2c1149def1955f76024ebb3750826a7513251b5b43c88e75e59ddb9c580dd]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211003124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:51d2c1149def1955f76024ebb3750826a7513251b5b43c88e75e59ddb9c580dd])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:51d2c1149def1955f76024ebb3750826a7513251b5b43c88e75e59ddb9c580dd
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:51d2c1149def1955f76024ebb3750826a7513251b5b43c88e75e59ddb9c580dd
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:51d2c1149def1955f76024ebb3750826a7513251b5b43c88e75e59ddb9c580dd].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 50s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/kqhftznrjmsny

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #107

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/107/display/redirect?page=changes>

Changes:

[noreply] Switching to innerText and removing alerts

[noreply] [BEAM-12993] Update to Debezium 1.7.0.Final (#15636)

[noreply] Minor: Fix `Iterable[SplitResultResidual]` type errors (#15634)

[noreply] [BEAM-11217] Implemented metrics filtering (#15482)

[noreply] [BEAM-12513] Schemas and Coders (#15632)


------------------------------------------
[...truncated 49.58 KB...]
640645f1878b: Preparing
29ce3742c9d5: Preparing
52840aa4aa92: Preparing
b28e321914c8: Preparing
791896ea0949: Preparing
d222b75e7c8c: Preparing
874ad65f91ea: Preparing
0fc2498b65e5: Preparing
d08e6b97bf21: Preparing
3054497613e6: Preparing
d35dc7f4c79e: Preparing
dabfe5b2ea81: Preparing
29ce3742c9d5: Waiting
52840aa4aa92: Waiting
5e6a409f30b6: Preparing
b28e321914c8: Waiting
874ad65f91ea: Waiting
0fc2498b65e5: Waiting
791896ea0949: Waiting
d222b75e7c8c: Waiting
d08e6b97bf21: Waiting
3054497613e6: Waiting
d35dc7f4c79e: Waiting
ab6a9c92964e: Waiting
640645f1878b: Waiting
dabfe5b2ea81: Waiting
5e6a409f30b6: Waiting
2c2a8799badc: Pushed
53c97e82a729: Pushed
fadf7501ed55: Pushed
3fde5d19938d: Pushed
ab6a9c92964e: Pushed
dfd8c9ec8f3f: Pushed
29ce3742c9d5: Pushed
52840aa4aa92: Pushed
874ad65f91ea: Layer already exists
0fc2498b65e5: Layer already exists
d08e6b97bf21: Layer already exists
791896ea0949: Pushed
640645f1878b: Pushed
3054497613e6: Layer already exists
d35dc7f4c79e: Layer already exists
dabfe5b2ea81: Layer already exists
5e6a409f30b6: Layer already exists
d222b75e7c8c: Pushed
b28e321914c8: Pushed
20211002124334: digest: sha256:ed13aef81494bd85134a92db6119708c986b99a1f942cff69996c73fd69161d1 size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 02, 2021 12:45:35 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 02, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 02, 2021 12:45:36 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 02, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 02, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 02, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /home/jenkins/.m2/repository/org/apache/beam/beam-vendor-grpc-1_36_0/0.2/beam-vendor-grpc-1_36_0-0.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-cBdYf-JPXEZ1XnZqk0m4-VlBsCKU_yOoV0qLHUFCl2c.jar
Oct 02, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 194 files cached, 1 files newly uploaded in 1 seconds
Oct 02, 2021 12:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 02, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash ce3df63d83166cf44021866553c29fac922b02d6e869a1bf66db1c0754bb456d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-zj32PYMWbPRAIYZlU8KfrJIrAtboaaG_ZtscB1S7RW0.pb
Oct 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 02, 2021 12:45:42 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@aaa0f76, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@340a8894, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a8b9166, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4acc5dff, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@10c72a6f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70e94ecb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56cfe111, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e446d92, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57f9b467, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d5c2745, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44b29496]
Oct 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 02, 2021 12:45:42 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34b9eb03, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43fda8d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49d831c2]
Oct 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Oct 02, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-02_05_45_42-13751100958647493540?project=apache-beam-testing
Oct 02, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-02_05_45_42-13751100958647493540
Oct 02, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-02_05_45_42-13751100958647493540
Oct 02, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-02T12:45:49.687Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-cx6n. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:54.585Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.331Z: Expanding SplittableParDo operations into optimizable parts.
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.358Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.411Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.478Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.500Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.550Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.651Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.686Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.714Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.741Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.772Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.804Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.835Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.858Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.883Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.916Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.944Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.970Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:55.992Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:56.015Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:56.048Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:56.089Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:56.116Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:56.145Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:56.177Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:56.212Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:56.237Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:56.261Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:56.289Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 02, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:45:56.621Z: Starting 5 workers in us-central1-a...
Oct 02, 2021 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:46:28.183Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 02, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:46:37.891Z: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 02, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:46:37.935Z: Resized worker pool to 1, though goal was 5. This could be a quota issue.
Oct 02, 2021 12:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:46:48.264Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 02, 2021 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:47:40.912Z: Workers have started successfully.
Oct 02, 2021 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T12:47:40.939Z: Workers have started successfully.
Oct 02, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T16:00:24.861Z: Cancel request is committed for workflow job: 2021-10-02_05_45_42-13751100958647493540.
Oct 02, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T16:00:24.924Z: Cleaning up.
Oct 02, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T16:00:24.996Z: Stopping worker pool...
Oct 02, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T16:00:25.060Z: Stopping worker pool...
Oct 02, 2021 4:02:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T16:02:50.544Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 02, 2021 4:02:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-02T16:02:50.575Z: Worker pool stopped.
Oct 02, 2021 4:02:57 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-02_05_45_42-13751100958647493540 finished with status CANCELLED.
Load test results for test (ID): d6cdfdb6-008c-4f4f-9cfa-5f603b82dbc6 and timestamp: 2021-10-02T12:45:36.151000000Z:
Metric:                                   Value:
dataflow_v2_java11_runtime_sec            11129.238
dataflow_v2_java11_total_bytes_count      2.34195967E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
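The RuntimeException above is the load-test harness reporting the job's terminal state: the streaming job is cancelled on schedule (Cancel request at 16:00), yet the harness still treats CANCELLED as a failed run and fails the Gradle task. A minimal sketch of that decision follows; the class and method names here are illustrative only, not Beam's actual `JobFailure` implementation, and the assumption that only DONE counts as success is inferred from this log:

```java
import java.util.Set;

public class JobStateCheck {
    // Assumption (hypothetical, inferred from the log above):
    // DONE is the only terminal state treated as success.
    private static final Set<String> OK_STATES = Set.of("DONE");

    // Mirrors the observable behavior: a CANCELLED job raises
    // "Invalid job state: CANCELLED." and fails the build.
    static void handleTerminalState(String state) {
        if (!OK_STATES.contains(state)) {
            throw new RuntimeException("Invalid job state: " + state + ".");
        }
    }

    public static void main(String[] args) {
        handleTerminalState("DONE"); // success: no exception
        try {
            handleTerminalState("CANCELLED");
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

Under this reading, a run that is meant to be cancelled after a fixed duration will always mark the build FAILED unless the harness is told to expect cancellation.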

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211002124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ed13aef81494bd85134a92db6119708c986b99a1f942cff69996c73fd69161d1
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211002124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ed13aef81494bd85134a92db6119708c986b99a1f942cff69996c73fd69161d1]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211002124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ed13aef81494bd85134a92db6119708c986b99a1f942cff69996c73fd69161d1])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ed13aef81494bd85134a92db6119708c986b99a1f942cff69996c73fd69161d1
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ed13aef81494bd85134a92db6119708c986b99a1f942cff69996c73fd69161d1
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ed13aef81494bd85134a92db6119708c986b99a1f942cff69996c73fd69161d1].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 40s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/43klybgxzcpt2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #106

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/106/display/redirect?page=changes>

Changes:

[stranniknm] [BEAM-12951]: Create initial structure for Playground application

[dpcollins] [BEAM-12908] Change to use PubsubSignal for information propagation so

[Robert Bradshaw] Preserve more types in transform replacement.

[rohde.samuel] Fix BEAM-12984

[Robert Bradshaw] Update test to reflect preserved type hint.

[noreply] Update Beam glossary (#15619)

[noreply] [BEAM-11985] Python Bigtable - Implement IO Request Count metrics

[noreply] [BEAM-9918] Make TryCrossLanguage match non Try API (#15633)

[noreply] [BEAM-12957] Add support for pyarrow 5.x (#15588)


------------------------------------------
[...truncated 50.14 KB...]
3d2dc6a1e0cc: Pushed
6e6702e72581: Pushed
747315a15ae9: Pushed
874ad65f91ea: Layer already exists
0fc2498b65e5: Layer already exists
d08e6b97bf21: Layer already exists
3054497613e6: Layer already exists
d35dc7f4c79e: Layer already exists
dabfe5b2ea81: Layer already exists
2292b797a958: Pushed
3eda0e4baae4: Pushed
5e6a409f30b6: Layer already exists
806581f04ff7: Pushed
85a429cff7e2: Pushed
20211001124333: digest: sha256:dd477c33c5beb826333aa466e18538363e2c3fe7f2cf4a85ac8e44997f97253b size: 4311

> Task :sdks:java:testing:load-tests:run
Oct 01, 2021 12:45:23 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Oct 01, 2021 12:45:24 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Oct 01, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 01, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Oct 01, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Oct 01, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108949 bytes, hash 1b4010c853ec5001e5a693f58392bd22dd98accd9e44d7b7959cfc9a7a297b01> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-G0AQyFPsUAHlppP1g5K9It2YrM2eRNe3lZz8mnopewE.pb
Oct 01, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 01, 2021 12:45:29 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc]
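The "Split into 20 bundles" message shows the synthetic unbounded source being pre-split so bundles can be read in parallel. A rough sketch of near-even range splitting is below; `splitRange` and the even-division strategy are illustrative assumptions, not `SyntheticUnboundedSource`'s actual internals:

```java
import java.util.ArrayList;
import java.util.List;

public class RangeSplitter {
    // A [start, end) sub-range of a synthetic key space.
    record Range(long start, long end) {}

    // Divide [start, end) into at most desiredBundles near-equal ranges.
    static List<Range> splitRange(long start, long end, int desiredBundles) {
        List<Range> bundles = new ArrayList<>();
        long total = end - start;
        // Ceiling division so the last bundle absorbs any remainder.
        long chunk = Math.max(1, (total + desiredBundles - 1) / desiredBundles);
        for (long s = start; s < end; s += chunk) {
            bundles.add(new Range(s, Math.min(s + chunk, end)));
        }
        return bundles;
    }

    public static void main(String[] args) {
        // 2,000,000 synthetic elements split 20 ways -> 20 bundles of 100,000.
        List<Range> bundles = splitRange(0, 2_000_000, 20);
        System.out.println(bundles.size()); // 20
    }
}
```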
Oct 01, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Oct 01, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Oct 01, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Oct 01, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Oct 01, 2021 12:45:29 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e]
Oct 01, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Oct 01, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Oct 01, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Oct 01, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Oct 01, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Oct 01, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Oct 01, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Oct 01, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Oct 01, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Oct 01, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Oct 01, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Oct 01, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Oct 01, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-01_05_45_30-5041997696408205737?project=apache-beam-testing
Oct 01, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-10-01_05_45_30-5041997696408205737
Oct 01, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-01_05_45_30-5041997696408205737
Oct 01, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-10-01T12:45:36.699Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-10-772t. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Oct 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:41.362Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.196Z: Expanding SplittableParDo operations into optimizable parts.
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.231Z: Expanding CollectionToSingleton operations into optimizable parts.
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.299Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.364Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.402Z: Expanding GroupByKey operations into streaming Read/Write steps
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.460Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.551Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.584Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.612Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.636Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.676Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.707Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.729Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.752Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.779Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.817Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.869Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.903Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.930Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.966Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:42.992Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:43.023Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:43.056Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:43.086Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:43.153Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:43.193Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:43.226Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:43.252Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Oct 01, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:43.287Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Oct 01, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:43.698Z: Starting 5 workers in us-central1-a...
Oct 01, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:45:49.563Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 01, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:46:24.841Z: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 01, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:46:24.900Z: Resized worker pool to 2, though goal was 5. This could be a quota issue.
Oct 01, 2021 12:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:46:35.247Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Oct 01, 2021 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:47:26.897Z: Workers have started successfully.
Oct 01, 2021 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T12:47:26.935Z: Workers have started successfully.
Oct 01, 2021 1:59:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T13:59:13.320Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Oct 01, 2021 1:59:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T13:59:17.037Z: Worker configuration: e2-standard-2 in us-central1-a.
Oct 01, 2021 1:59:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T13:59:21.992Z: Workers have started successfully.
Oct 01, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T16:00:32.460Z: Cancel request is committed for workflow job: 2021-10-01_05_45_30-5041997696408205737.
Oct 01, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T16:00:32.529Z: Cleaning up.
Oct 01, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T16:00:32.672Z: Stopping worker pool...
Oct 01, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T16:00:32.718Z: Stopping worker pool...
Oct 01, 2021 4:02:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T16:02:51.220Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 01, 2021 4:02:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-10-01T16:02:51.266Z: Worker pool stopped.
Oct 01, 2021 4:02:57 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-10-01_05_45_30-5041997696408205737 finished with status CANCELLED.
Load test results for test (ID): fb59a485-203f-4f1e-9098-c28e1c02c18f and timestamp: 2021-10-01T12:45:24.398000000Z:
Metric:                                   Value:
dataflow_v2_java11_runtime_sec            11531.837
dataflow_v2_java11_total_bytes_count      2.43221006E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211001124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dd477c33c5beb826333aa466e18538363e2c3fe7f2cf4a85ac8e44997f97253b
Deleted: sha256:c01e8807dba0aab5a842cbf698b778c444509fcd0dec2b2a119ca3add85024bd
Deleted: sha256:31dfd48fe9b93e69db6ea66c559a9b21d73a2f725187c45bda42f1a43fbc8e0d
Deleted: sha256:f431a51fde4110ef9eb0558f4033d8d0c629c8d3379ac5ecc74771c7bae11666
Deleted: sha256:0fe799e990d18863c8c3f8005b89c6a2fcc034205fcc41d08d58836ae9b735c2
Deleted: sha256:5507d971e203d2665cbbed5aa625501a340d9b7b2f430b7d0586afad3ddda6ea
Deleted: sha256:d17ecca3fe50e37de306278d3e0015691cc182fc06947e50a4f210f567753996
Deleted: sha256:42682df17fb6d59c71a2a771fa9d5dbb8a1a4dccdb7b3f1ad183cc2ae4162544
Deleted: sha256:a67af73edaab39b6681540f60ced171487a5819ae79d97d9b354e2d4cbe2e5cb
Deleted: sha256:11591c7607b82920587f37c2c7ffc783b3ae5291e6a9ae7635e345484db9c966
Deleted: sha256:87a161edbb103585e5d0d343db07c80193c8efeb4c50f0feb7a052c4817901ad
Deleted: sha256:77c9bc2c908ea28df6c216590ae706b8f984752cceaddb9369c240269310a123
Deleted: sha256:9b5a76fc3eb6551048994037c0ceefb28a1c0cf5e1156af37a286c1c3a7b7082
Deleted: sha256:4f66acc753892fbeac4a54e7c974e476788f9876b87c5d5094d670d99b9e551e
Deleted: sha256:1bea7a1dc3b875a77fb3cf3af62a3fb60756fda31895a0e7b8c5387c17f50694
Deleted: sha256:a860b823499c0bb1181bbf4c9668eeb03655778b5c7a491231b67efe37f04d26
Deleted: sha256:342e0bc8c4f7bba121a3fedc221e43ff54e66e9189167c2d0f486dccb7b278ca
Deleted: sha256:04b34b54af1820ef757b047ec9ae096ebbb697718c5177f7c9dc56fe5428e5ba
Deleted: sha256:d3f85266edb2d64348af2a7b0bd960d8a9787994986521d03dd7c490600996df
Deleted: sha256:43b8cb7f9a927f016e291300173b6076a4c7af606e3af951e3ede2ce31e5622f
Deleted: sha256:9355f50c015c060021d2f4ba8a97685aa982b83d6995365a87a3924dff3a0a2e
Deleted: sha256:cd7f70c4017ea618dedaf045bf5f9c74d7a5608e99baf91d4ef4b35e030800ae
Deleted: sha256:3e27d3af5e7a0a467e94d34c5687ab3f678d03c40bef575f16aae1b6dc2707bb
Deleted: sha256:d93a6e9dd72cd6957e527137bcc6661826942f253efaefc568cc3a373edc81cd
Deleted: sha256:5f8b9633bfbbafb3084c1db94e0361c0042e02ca12d73f2e5353b969ca475e85
Deleted: sha256:0de755e6293192f8c30be83e8031207668e7ad0200d7d66db6e8cf710d5b72bd
Deleted: sha256:b4ed11189636cc3cfeb2fbb9c12a5a908fd3c9ef12f37ef777749bcf33de7c93
Deleted: sha256:2f984c5689db1c797bc425eb8f3289934dfd45b1b503bde5555dabc2b2dc9297
Deleted: sha256:ed4e6008df64cf5464d9421fd9dce7b47f257e8ffdc8a8149f911628563ab1ac
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211001124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dd477c33c5beb826333aa466e18538363e2c3fe7f2cf4a85ac8e44997f97253b]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211001124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dd477c33c5beb826333aa466e18538363e2c3fe7f2cf4a85ac8e44997f97253b])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dd477c33c5beb826333aa466e18538363e2c3fe7f2cf4a85ac8e44997f97253b
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dd477c33c5beb826333aa466e18538363e2c3fe7f2cf4a85ac8e44997f97253b
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:dd477c33c5beb826333aa466e18538363e2c3fe7f2cf4a85ac8e44997f97253b].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 42s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/m6kjyg6hv5jv4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #105

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/105/display/redirect?page=changes>

Changes:

[kileysok] Update windmill state map to resolve *IfAbsent methods immediately

[samuelw] [BEAM-12942] Validate pubsub messages before they are published in

[clairem] [BEAM-12628] make useReflectApi default to true

[noreply] [BEAM-12856] Change hard-coded limits for reading from a UnboundedReader

[piotr.szczepanik] [BEAM-12356] Fixed last non-cached usage of DatasetService in BigQuery

[noreply] [BEAM-12977] Translates Reshuffle in Portable Mode with Samza native

[noreply] [BEAM-12982] Help users debug which JvmInitializer is running and when.


------------------------------------------
[...truncated 53.44 KB...]
INFO: Adding Read input/StripIds as step s2
Sep 30, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 30, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 30, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 30, 2021 12:45:24 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34b9eb03, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43fda8d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49d831c2]
Sep 30, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 30, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 30, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 30, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 30, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 30, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 30, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 30, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 30, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 30, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 30, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 30, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 30, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-30_05_45_25-10486700576588734041?project=apache-beam-testing
Sep 30, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-30_05_45_25-10486700576588734041
Sep 30, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-30_05_45_25-10486700576588734041
Sep 30, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-30T12:45:33.878Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-qp0w. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:38.763Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:39.569Z: Expanding SplittableParDo operations into optimizable parts.
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:39.611Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:39.684Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:39.758Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:39.786Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:39.833Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:39.942Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:39.969Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.000Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.026Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.061Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.085Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.107Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.144Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.177Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.221Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.257Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Sep 30, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.291Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Sep 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.330Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.360Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.390Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.445Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.585Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.723Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Sep 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.778Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Sep 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.821Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Sep 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.849Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.883Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Sep 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:40.913Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Sep 30, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:45:41.270Z: Starting 5 workers in us-central1-a...
Sep 30, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:46:06.036Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 30, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:46:17.937Z: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 30, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:46:17.963Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
Sep 30, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:46:28.302Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 30, 2021 12:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:47:26.885Z: Workers have started successfully.
Sep 30, 2021 12:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T12:47:26.919Z: Workers have started successfully.
Sep 30, 2021 2:51:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:41.426Z: Staged package amazon-kinesis-client-1.14.2-qtBh3OuKuLE7S26ITTJzYsoYfG-mWjqbGmv3VJPhec8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/amazon-kinesis-client-1.14.2-qtBh3OuKuLE7S26ITTJzYsoYfG-mWjqbGmv3VJPhec8.jar' is inaccessible.
Sep 30, 2021 2:51:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:41.579Z: Staged package annotations-4.1.1.4-unNOHoTAnWFa9qCdMwNLTwRC-Hct7BIO-zdthqVlrhU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/annotations-4.1.1.4-unNOHoTAnWFa9qCdMwNLTwRC-Hct7BIO-zdthqVlrhU.jar' is inaccessible.
Sep 30, 2021 2:51:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:41.803Z: Staged package avro-1.8.2-91SggwzmelqfpnpU7BXRA-8V4chQ17Jvr3tkfu3cgtM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/avro-1.8.2-91SggwzmelqfpnpU7BXRA-8V4chQ17Jvr3tkfu3cgtM.jar' is inaccessible.
Sep 30, 2021 2:51:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:42.460Z: Staged package beam-vendor-bytebuddy-1_11_0-0.1-SneHjzQgCpxxLkvyN8dugqO_sTzwXJ2995kqFu4l2i4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-bytebuddy-1_11_0-0.1-SneHjzQgCpxxLkvyN8dugqO_sTzwXJ2995kqFu4l2i4.jar' is inaccessible.
Sep 30, 2021 2:51:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:42.545Z: Staged package beam-vendor-guava-26_0-jre-0.1-8quizfXTrmcvJyOXi3zWDFtP5EofgqjksY5sIbsBqws.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-guava-26_0-jre-0.1-8quizfXTrmcvJyOXi3zWDFtP5EofgqjksY5sIbsBqws.jar' is inaccessible.
Sep 30, 2021 2:51:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:42.635Z: Staged package checker-compat-qual-2.5.5-EdE0skXpysxHRRTS1mtbhhj4A5oUZc3FW7wLNOAAi3o.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/checker-compat-qual-2.5.5-EdE0skXpysxHRRTS1mtbhhj4A5oUZc3FW7wLNOAAi3o.jar' is inaccessible.
Sep 30, 2021 2:51:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:42.840Z: Staged package commons-lang3-3.9-3i4dzc8--ReozoWGYaBnJqmpRPKOM61_ngi-pE3DwjA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-lang3-3.9-3i4dzc8--ReozoWGYaBnJqmpRPKOM61_ngi-pE3DwjA.jar' is inaccessible.
Sep 30, 2021 2:51:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:42.939Z: Staged package commons-logging-1.2-2t3qHqC-D1aXirMAa4rJKDSv7vvZt-TmMW_KV98PpjY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-logging-1.2-2t3qHqC-D1aXirMAa4rJKDSv7vvZt-TmMW_KV98PpjY.jar' is inaccessible.
Sep 30, 2021 2:51:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:43.094Z: Staged package commons-math3-3.6.1-HlbXsFjSi2Wr0la4RY44hbZ0wdWI-kPNfRy7nH7yswg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-math3-3.6.1-HlbXsFjSi2Wr0la4RY44hbZ0wdWI-kPNfRy7nH7yswg.jar' is inaccessible.
Sep 30, 2021 2:51:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:43.205Z: Staged package datastore-v1-proto-client-1.6.3-8GhVpKiAAK6wztlpNJ48AH4jyyB1KYo-5qJLXiBC6cw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/datastore-v1-proto-client-1.6.3-8GhVpKiAAK6wztlpNJ48AH4jyyB1KYo-5qJLXiBC6cw.jar' is inaccessible.
Sep 30, 2021 2:51:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:44.370Z: Staged package gson-2.8.6-yPtIOQVNKAswM_gA0fWpfeLwKOuLoutFitKH5Tbz8l8.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/gson-2.8.6-yPtIOQVNKAswM_gA0fWpfeLwKOuLoutFitKH5Tbz8l8.jar' is inaccessible.
Sep 30, 2021 2:51:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:44.493Z: Staged package hamcrest-2.1-upOy46ViMiukMvChtTrdzFXLGIJTMZoCDtd_gk5pIFA.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/hamcrest-2.1-upOy46ViMiukMvChtTrdzFXLGIJTMZoCDtd_gk5pIFA.jar' is inaccessible.
Sep 30, 2021 2:51:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:44.545Z: Staged package hamcrest-core-2.1-4JEJ5UoonYhQa5v-yYfd0Zn0IXyUZBMmaDUbmk8Avuk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/hamcrest-core-2.1-4JEJ5UoonYhQa5v-yYfd0Zn0IXyUZBMmaDUbmk8Avuk.jar' is inaccessible.
Sep 30, 2021 2:51:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:44.665Z: Staged package j2objc-annotations-1.3-Ia8wySJnvWEiwOC00gzMtmQaN-r5VsZUDsRx1YTmSns.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/j2objc-annotations-1.3-Ia8wySJnvWEiwOC00gzMtmQaN-r5VsZUDsRx1YTmSns.jar' is inaccessible.
Sep 30, 2021 2:51:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:44.748Z: Staged package jackson-core-asl-1.9.13-RAqctcqVshX5U9OiCmsaENofCbUpqd3qX4pJBd2rT1o.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-core-asl-1.9.13-RAqctcqVshX5U9OiCmsaENofCbUpqd3qX4pJBd2rT1o.jar' is inaccessible.
Sep 30, 2021 2:51:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:44.839Z: Staged package jackson-mapper-asl-1.9.13-dOegenby7breKTEqWi68z6AZEovAIezjhW12GX6b4MI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-mapper-asl-1.9.13-dOegenby7breKTEqWi68z6AZEovAIezjhW12GX6b4MI.jar' is inaccessible.
Sep 30, 2021 2:51:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:44.897Z: Staged package jakarta.activation-api-1.2.2-oYepORA671hJp6-EvX4nvi0SDEEK8pFDc3X_4GH08J0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jakarta.activation-api-1.2.2-oYepORA671hJp6-EvX4nvi0SDEEK8pFDc3X_4GH08J0.jar' is inaccessible.
Sep 30, 2021 2:51:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:44.951Z: Staged package jakarta.xml.bind-api-2.3.3-wEU59HLppt0MdoXqgtZ3KCJpq457rKLhRQDjgeDGzsU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jakarta.xml.bind-api-2.3.3-wEU59HLppt0MdoXqgtZ3KCJpq457rKLhRQDjgeDGzsU.jar' is inaccessible.
Sep 30, 2021 2:51:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:45.006Z: Staged package javax.annotation-api-1.3.2-4EulGVvNVV3JVlD3zGFNFR5LzVLSmhC4qiGX86uJq5s.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/javax.annotation-api-1.3.2-4EulGVvNVV3JVlD3zGFNFR5LzVLSmhC4qiGX86uJq5s.jar' is inaccessible.
Sep 30, 2021 2:51:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:45.076Z: Staged package json-20200518-D_MObagvFa-vDna9bplqerVWaFEWaoP5c8ZKmg-UbS0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/json-20200518-D_MObagvFa-vDna9bplqerVWaFEWaoP5c8ZKmg-UbS0.jar' is inaccessible.
Sep 30, 2021 2:51:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-30T14:51:45.122Z: Staged package jsr305-3.0.2-dmrSoHg_JoeWLIrXTO7MOKKLn3Ki0IXuQ4t4E-ko0Mc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jsr305-3.0.2-dmrSoHg_JoeWLIrXTO7MOKKLn3Ki0IXuQ4t4E-ko0Mc.jar' is inaccessible.
Sep 30, 2021 2:51:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-30T14:51:45.986Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 30, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T16:00:35.199Z: Cancel request is committed for workflow job: 2021-09-30_05_45_25-10486700576588734041.
Sep 30, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T16:00:35.415Z: Cleaning up.
Sep 30, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T16:00:35.501Z: Stopping worker pool...
Sep 30, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T16:00:35.559Z: Stopping worker pool...
Sep 30, 2021 4:03:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T16:03:04.190Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 30, 2021 4:03:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-30T16:03:04.230Z: Worker pool stopped.
Sep 30, 2021 4:03:11 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-30_05_45_25-10486700576588734041 finished with status CANCELLED.
Load test results for test (ID): c0aa9c2c-2d92-4cee-882b-2e297f9c60aa and timestamp: 2021-09-30T12:45:20.300000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11556.755
dataflow_v2_java11_total_bytes_count             2.99148595E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210930124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4557d1d81f235f0b2d1d85ff19eeb6e31b86e1e220d14d2d4d1b97b0484a4fc7
Deleted: sha256:4a921f3ddf13e4dde2bf38841c800f227027888eec37f41d35793217b0d911b6
Deleted: sha256:1483bac0782df00e251f4a7ac101080f704f38831282d1bf3a082e4267b6a863
Deleted: sha256:88fe30db21d041f8a1ab60dd0b4d4e8018f335cb3236afd0735437a02c18770e
Deleted: sha256:bb0393cfcb74572a3aa4a152cd1a0d1251b7a4510c26359b1407babf77645b1d
Deleted: sha256:77e421a2e2054c1c73c6e0272d0cde0edfe748ac03063c5382d7071edcb9d184
Deleted: sha256:b03df5f20755da811b68c034e3b920590d5afd4ff76db0397b085ade4b6297ea
Deleted: sha256:ec44ead991e8a26087a06d0460bb94a055593c9a6830b737b1f9a7b3d4923fbb
Deleted: sha256:d1b7a8a07febb4057f92dc8951e5583accb2792e793fecfab0a41afad2b53b61
Deleted: sha256:a06eaa286ccf1b79502e7377c41dae9ec0914ff2f8566de4dc6d235d55c1e540
Deleted: sha256:aa19a9a178ad436341d3fad631ca1de8e60f87cd9def0adebdaf342e767bfcf5
Deleted: sha256:d3affdddaf41ae8ae76b88fea030ee028eea135a6fb10e5f107cff13f2e97335
Deleted: sha256:70a89a6ac16cdfcd9de9a74b6f83b8aaa57d6d518b4f500a327321f70b29bc99
Deleted: sha256:e57de2b7f10f7ffb5a09929446b4f1a6c719683f58b22785bf810cf358cfeca3
Deleted: sha256:96c48d3376f31852f54245221807c751c35f6e7454c5b06ec4888e7945a4a1d4
Deleted: sha256:649c4d442a244a7907ee8d1a9302a52b645a3a8ff290238d023eeeed3517aa8a
Deleted: sha256:1c2304e0befd96984e05865cb4e650e1d036eadfc1de279edcf1c0a339de1a05
Deleted: sha256:c028a26b3d101b1bf005f819338f2dbac406704988c496aeb8bebf564a3fb36e
Deleted: sha256:c4301378493c8c4a7590ed9c01979a18eb41eeb72b1226bce1c98658a7c6192d
Deleted: sha256:62e6a0d8c80089c5d27bed21b1a63872f1a98d679ad72ac23aa32bab77f73a5a
Deleted: sha256:dc5c455a042545bac84356892998f30f257649a5de2e8f5af6330e94a45198dd
Deleted: sha256:25b5859570cbeeace215ba45d6035b802401e86fde4023ce77c781c7475b98e9
Deleted: sha256:71e9d045fcbc4b4e5610e84f6d3fd7566a0f53cf6c3ff67396050991212cf144
Deleted: sha256:0ac16a6a0ef806998fa7fd784f02f140c45251b666bd31e1270d6fac04db8c87
Deleted: sha256:7ddcb572f9f31ece0acebf3f1658b6f6a8be8e96ed56176b6f412caa16d4acc8
Deleted: sha256:80b31ec477f265a13380faf29ab4dda282e8d21cf0830d2610ae244340df1456
Deleted: sha256:0b6b291060d2f285a1c5f03263d18a976171f4267086fd714ababd5bcddd10d7
Deleted: sha256:8addd63e2f780eeb0e61d7ddb9518b17a9a0acb6aea405cb9afd1e2445a66e3d
Deleted: sha256:591ea11b3c177685ef5c13b3a4805a434476f4d6115117d5e14ee9bdb967bc86
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210930124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4557d1d81f235f0b2d1d85ff19eeb6e31b86e1e220d14d2d4d1b97b0484a4fc7]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210930124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4557d1d81f235f0b2d1d85ff19eeb6e31b86e1e220d14d2d4d1b97b0484a4fc7])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4557d1d81f235f0b2d1d85ff19eeb6e31b86e1e220d14d2d4d1b97b0484a4fc7
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4557d1d81f235f0b2d1d85ff19eeb6e31b86e1e220d14d2d4d1b97b0484a4fc7
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4557d1d81f235f0b2d1d85ff19eeb6e31b86e1e220d14d2d4d1b97b0484a4fc7].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 54s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/seu2yzalrbneu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #104

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/104/display/redirect?page=changes>

Changes:

[chamikaramj] Sets the correct coder in 'ConstantTableDestinations' when using BQ

[Luke Cwik] [BEAM-12974] Migrate off of deprecated package for MockitoJUnitRunner

[noreply] [BEAM-12593] Verify DataFrame API on pandas 1.3 (with container update)

[noreply] [BEAM-12973] Print Go Test and Script info. (#15604)

[noreply] [BEAM-10913] - Installing and persisting Grafana plugin in kubernetes

[noreply] [BEAM-11129] Add namespace and key to portable display data (#15564)

[noreply] [BEAM-12906] Add a `dataframe` extra for installing a pandas version


------------------------------------------
[...truncated 84.97 KB...]
SEVERE: 2021-09-29T15:30:47.200Z: Staged package netty-codec-socks-4.1.52.Final-_sxXfgu3ch1naM_BhG7hcpEdFquA372ZldllelUdTqc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-socks-4.1.52.Final-_sxXfgu3ch1naM_BhG7hcpEdFquA372ZldllelUdTqc.jar' is inaccessible.
Sep 29, 2021 3:30:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:30:47.237Z: Staged package netty-common-4.1.52.Final-DR8QHk231TDtBNfKvVfZXAdQ8Fm7az-t_tDIDUc0xxc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-common-4.1.52.Final-DR8QHk231TDtBNfKvVfZXAdQ8Fm7az-t_tDIDUc0xxc.jar' is inaccessible.
Sep 29, 2021 3:30:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:30:47.281Z: Staged package netty-handler-4.1.52.Final-LNwh-_rgSUDIWSkDl3nJP1H2GPXvshYHkxnc_PMqVcE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-handler-4.1.52.Final-LNwh-_rgSUDIWSkDl3nJP1H2GPXvshYHkxnc_PMqVcE.jar' is inaccessible.
Sep 29, 2021 3:30:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:30:47.329Z: Staged package netty-handler-proxy-4.1.52.Final-6NB4Y860vUgG4lWh1TLIZQ8POy1NAGrW0hgS6UG0L9w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-handler-proxy-4.1.52.Final-6NB4Y860vUgG4lWh1TLIZQ8POy1NAGrW0hgS6UG0L9w.jar' is inaccessible.
Sep 29, 2021 3:30:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:30:47.407Z: Staged package netty-resolver-4.1.52.Final-butyojOXnLRbC01tRsWkEzLFfAz4qAOx78YNu-y6CaY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-resolver-4.1.52.Final-butyojOXnLRbC01tRsWkEzLFfAz4qAOx78YNu-y6CaY.jar' is inaccessible.
Sep 29, 2021 3:30:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:30:47.464Z: Staged package netty-transport-4.1.52.Final-mj5vjA5V3jY-seoQ_ngXl-yjlOYhht8q4LTrK84LRUE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-transport-4.1.52.Final-mj5vjA5V3jY-seoQ_ngXl-yjlOYhht8q4LTrK84LRUE.jar' is inaccessible.
Sep 29, 2021 3:30:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:30:47.507Z: Staged package opencensus-api-0.28.0-DBcj8_bTBhMjhFzouIs1_dpQCBLgp1uOtfzErYyHGpU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/opencensus-api-0.28.0-DBcj8_bTBhMjhFzouIs1_dpQCBLgp1uOtfzErYyHGpU.jar' is inaccessible.
Sep 29, 2021 3:30:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:30:47.614Z: Staged package paranamer-2.8-aIyxGKYCHYGROOhVIIyVYDFoi-S0eiS7YVvsxjrO3wc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/paranamer-2.8-aIyxGKYCHYGROOhVIIyVYDFoi-S0eiS7YVvsxjrO3wc.jar' is inaccessible.
Sep 29, 2021 3:30:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:30:47.661Z: Staged package perfmark-api-0.23.0-xwW1wQwY_zAyuegXQrwvaw5WB_am38DIrQz_ddSRMEI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/perfmark-api-0.23.0-xwW1wQwY_zAyuegXQrwvaw5WB_am38DIrQz_ddSRMEI.jar' is inaccessible.
Sep 29, 2021 3:30:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:30:48.078Z: Staged package snappy-java-1.1.8.4-JMTR_B6J4HgzGrj0AamcrWhZm95KLkUWBCy1SMUbHD4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/snappy-java-1.1.8.4-JMTR_B6J4HgzGrj0AamcrWhZm95KLkUWBCy1SMUbHD4.jar' is inaccessible.
Sep 29, 2021 3:30:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-29T15:30:48.269Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 29, 2021 3:33:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-29T15:33:47.866Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 29, 2021 3:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:36:44.038Z: Staged package auto-value-annotations-1.8.1-N-wJtH1-01qZ0Tkn21yG_JBx9iD5Q-rV11cURpgxCFI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/auto-value-annotations-1.8.1-N-wJtH1-01qZ0Tkn21yG_JBx9iD5Q-rV11cURpgxCFI.jar' is inaccessible.
Sep 29, 2021 3:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:36:44.943Z: Staged package commons-compress-1.21-auz9VFlyillWAc-gcljRMZcv_Dm0kutIvdWWV3ovJEo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-compress-1.21-auz9VFlyillWAc-gcljRMZcv_Dm0kutIvdWWV3ovJEo.jar' is inaccessible.
Sep 29, 2021 3:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:36:45.103Z: Staged package conscrypt-openjdk-uber-2.5.1-AfnHQstZKhUeLmK9U5eomAYoqWcAH82s1KpHRGeGhfM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/conscrypt-openjdk-uber-2.5.1-AfnHQstZKhUeLmK9U5eomAYoqWcAH82s1KpHRGeGhfM.jar' is inaccessible.
Sep 29, 2021 3:36:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:36:47.117Z: Staged package netty-buffer-4.1.52.Final-QcoQNa7m4PgXdZfQp2EKWqLG5vrHRKyB_eTl5Y9jlnU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-buffer-4.1.52.Final-QcoQNa7m4PgXdZfQp2EKWqLG5vrHRKyB_eTl5Y9jlnU.jar' is inaccessible.
Sep 29, 2021 3:36:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:36:47.154Z: Staged package netty-codec-4.1.52.Final-PH543M5-NT2GrH26LCUatc3dTrhuqoic_kIpeCuCjlI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-4.1.52.Final-PH543M5-NT2GrH26LCUatc3dTrhuqoic_kIpeCuCjlI.jar' is inaccessible.
Sep 29, 2021 3:36:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:36:47.194Z: Staged package netty-codec-http-4.1.52.Final-tdoo0OqHiwfTERrPXk-vQSiO8jArJkFBkP3Rp9lzyAw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-http-4.1.52.Final-tdoo0OqHiwfTERrPXk-vQSiO8jArJkFBkP3Rp9lzyAw.jar' is inaccessible.
Sep 29, 2021 3:36:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:36:47.242Z: Staged package netty-codec-http2-4.1.52.Final-61J6QtIonRLe5tR1PKJxWXije7tBxxkVzk6VOcSfk6E.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-http2-4.1.52.Final-61J6QtIonRLe5tR1PKJxWXije7tBxxkVzk6VOcSfk6E.jar' is inaccessible.
Sep 29, 2021 3:36:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:36:47.300Z: Staged package netty-codec-socks-4.1.52.Final-_sxXfgu3ch1naM_BhG7hcpEdFquA372ZldllelUdTqc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-socks-4.1.52.Final-_sxXfgu3ch1naM_BhG7hcpEdFquA372ZldllelUdTqc.jar' is inaccessible.
Sep 29, 2021 3:36:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:36:47.344Z: Staged package netty-common-4.1.52.Final-DR8QHk231TDtBNfKvVfZXAdQ8Fm7az-t_tDIDUc0xxc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-common-4.1.52.Final-DR8QHk231TDtBNfKvVfZXAdQ8Fm7az-t_tDIDUc0xxc.jar' is inaccessible.
Sep 29, 2021 3:36:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:36:47.385Z: Staged package netty-handler-4.1.52.Final-LNwh-_rgSUDIWSkDl3nJP1H2GPXvshYHkxnc_PMqVcE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-handler-4.1.52.Final-LNwh-_rgSUDIWSkDl3nJP1H2GPXvshYHkxnc_PMqVcE.jar' is inaccessible.
Sep 29, 2021 3:36:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:36:47.474Z: Staged package netty-handler-proxy-4.1.52.Final-6NB4Y860vUgG4lWh1TLIZQ8POy1NAGrW0hgS6UG0L9w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-handler-proxy-4.1.52.Final-6NB4Y860vUgG4lWh1TLIZQ8POy1NAGrW0hgS6UG0L9w.jar' is inaccessible.
Sep 29, 2021 3:36:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:36:47.510Z: Staged package netty-resolver-4.1.52.Final-butyojOXnLRbC01tRsWkEzLFfAz4qAOx78YNu-y6CaY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-resolver-4.1.52.Final-butyojOXnLRbC01tRsWkEzLFfAz4qAOx78YNu-y6CaY.jar' is inaccessible.
Sep 29, 2021 3:36:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:36:47.576Z: Staged package netty-transport-4.1.52.Final-mj5vjA5V3jY-seoQ_ngXl-yjlOYhht8q4LTrK84LRUE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-transport-4.1.52.Final-mj5vjA5V3jY-seoQ_ngXl-yjlOYhht8q4LTrK84LRUE.jar' is inaccessible.
Sep 29, 2021 3:36:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:36:47.640Z: Staged package opencensus-api-0.28.0-DBcj8_bTBhMjhFzouIs1_dpQCBLgp1uOtfzErYyHGpU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/opencensus-api-0.28.0-DBcj8_bTBhMjhFzouIs1_dpQCBLgp1uOtfzErYyHGpU.jar' is inaccessible.
Sep 29, 2021 3:36:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:36:47.739Z: Staged package paranamer-2.8-aIyxGKYCHYGROOhVIIyVYDFoi-S0eiS7YVvsxjrO3wc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/paranamer-2.8-aIyxGKYCHYGROOhVIIyVYDFoi-S0eiS7YVvsxjrO3wc.jar' is inaccessible.
Sep 29, 2021 3:36:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:36:47.782Z: Staged package perfmark-api-0.23.0-xwW1wQwY_zAyuegXQrwvaw5WB_am38DIrQz_ddSRMEI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/perfmark-api-0.23.0-xwW1wQwY_zAyuegXQrwvaw5WB_am38DIrQz_ddSRMEI.jar' is inaccessible.
Sep 29, 2021 3:36:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:36:48.302Z: Staged package snappy-java-1.1.8.4-JMTR_B6J4HgzGrj0AamcrWhZm95KLkUWBCy1SMUbHD4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/snappy-java-1.1.8.4-JMTR_B6J4HgzGrj0AamcrWhZm95KLkUWBCy1SMUbHD4.jar' is inaccessible.
Sep 29, 2021 3:36:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-29T15:36:48.446Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 29, 2021 3:39:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-29T15:39:47.714Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 29, 2021 3:42:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:42:44.077Z: Staged package auto-value-annotations-1.8.1-N-wJtH1-01qZ0Tkn21yG_JBx9iD5Q-rV11cURpgxCFI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/auto-value-annotations-1.8.1-N-wJtH1-01qZ0Tkn21yG_JBx9iD5Q-rV11cURpgxCFI.jar' is inaccessible.
Sep 29, 2021 3:42:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:42:45.088Z: Staged package commons-compress-1.21-auz9VFlyillWAc-gcljRMZcv_Dm0kutIvdWWV3ovJEo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-compress-1.21-auz9VFlyillWAc-gcljRMZcv_Dm0kutIvdWWV3ovJEo.jar' is inaccessible.
Sep 29, 2021 3:42:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:42:45.231Z: Staged package conscrypt-openjdk-uber-2.5.1-AfnHQstZKhUeLmK9U5eomAYoqWcAH82s1KpHRGeGhfM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/conscrypt-openjdk-uber-2.5.1-AfnHQstZKhUeLmK9U5eomAYoqWcAH82s1KpHRGeGhfM.jar' is inaccessible.
Sep 29, 2021 3:42:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:42:47.166Z: Staged package netty-buffer-4.1.52.Final-QcoQNa7m4PgXdZfQp2EKWqLG5vrHRKyB_eTl5Y9jlnU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-buffer-4.1.52.Final-QcoQNa7m4PgXdZfQp2EKWqLG5vrHRKyB_eTl5Y9jlnU.jar' is inaccessible.
Sep 29, 2021 3:42:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:42:47.212Z: Staged package netty-codec-4.1.52.Final-PH543M5-NT2GrH26LCUatc3dTrhuqoic_kIpeCuCjlI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-4.1.52.Final-PH543M5-NT2GrH26LCUatc3dTrhuqoic_kIpeCuCjlI.jar' is inaccessible.
Sep 29, 2021 3:42:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:42:47.262Z: Staged package netty-codec-http-4.1.52.Final-tdoo0OqHiwfTERrPXk-vQSiO8jArJkFBkP3Rp9lzyAw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-http-4.1.52.Final-tdoo0OqHiwfTERrPXk-vQSiO8jArJkFBkP3Rp9lzyAw.jar' is inaccessible.
Sep 29, 2021 3:42:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:42:47.304Z: Staged package netty-codec-http2-4.1.52.Final-61J6QtIonRLe5tR1PKJxWXije7tBxxkVzk6VOcSfk6E.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-http2-4.1.52.Final-61J6QtIonRLe5tR1PKJxWXije7tBxxkVzk6VOcSfk6E.jar' is inaccessible.
Sep 29, 2021 3:42:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:42:47.348Z: Staged package netty-codec-socks-4.1.52.Final-_sxXfgu3ch1naM_BhG7hcpEdFquA372ZldllelUdTqc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-socks-4.1.52.Final-_sxXfgu3ch1naM_BhG7hcpEdFquA372ZldllelUdTqc.jar' is inaccessible.
Sep 29, 2021 3:42:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:42:47.377Z: Staged package netty-common-4.1.52.Final-DR8QHk231TDtBNfKvVfZXAdQ8Fm7az-t_tDIDUc0xxc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-common-4.1.52.Final-DR8QHk231TDtBNfKvVfZXAdQ8Fm7az-t_tDIDUc0xxc.jar' is inaccessible.
Sep 29, 2021 3:42:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:42:47.418Z: Staged package netty-handler-4.1.52.Final-LNwh-_rgSUDIWSkDl3nJP1H2GPXvshYHkxnc_PMqVcE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-handler-4.1.52.Final-LNwh-_rgSUDIWSkDl3nJP1H2GPXvshYHkxnc_PMqVcE.jar' is inaccessible.
Sep 29, 2021 3:42:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:42:47.480Z: Staged package netty-handler-proxy-4.1.52.Final-6NB4Y860vUgG4lWh1TLIZQ8POy1NAGrW0hgS6UG0L9w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-handler-proxy-4.1.52.Final-6NB4Y860vUgG4lWh1TLIZQ8POy1NAGrW0hgS6UG0L9w.jar' is inaccessible.
Sep 29, 2021 3:42:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:42:47.525Z: Staged package netty-resolver-4.1.52.Final-butyojOXnLRbC01tRsWkEzLFfAz4qAOx78YNu-y6CaY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-resolver-4.1.52.Final-butyojOXnLRbC01tRsWkEzLFfAz4qAOx78YNu-y6CaY.jar' is inaccessible.
Sep 29, 2021 3:42:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:42:47.584Z: Staged package netty-transport-4.1.52.Final-mj5vjA5V3jY-seoQ_ngXl-yjlOYhht8q4LTrK84LRUE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-transport-4.1.52.Final-mj5vjA5V3jY-seoQ_ngXl-yjlOYhht8q4LTrK84LRUE.jar' is inaccessible.
Sep 29, 2021 3:42:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:42:47.650Z: Staged package opencensus-api-0.28.0-DBcj8_bTBhMjhFzouIs1_dpQCBLgp1uOtfzErYyHGpU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/opencensus-api-0.28.0-DBcj8_bTBhMjhFzouIs1_dpQCBLgp1uOtfzErYyHGpU.jar' is inaccessible.
Sep 29, 2021 3:42:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:42:47.751Z: Staged package paranamer-2.8-aIyxGKYCHYGROOhVIIyVYDFoi-S0eiS7YVvsxjrO3wc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/paranamer-2.8-aIyxGKYCHYGROOhVIIyVYDFoi-S0eiS7YVvsxjrO3wc.jar' is inaccessible.
Sep 29, 2021 3:42:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:42:47.805Z: Staged package perfmark-api-0.23.0-xwW1wQwY_zAyuegXQrwvaw5WB_am38DIrQz_ddSRMEI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/perfmark-api-0.23.0-xwW1wQwY_zAyuegXQrwvaw5WB_am38DIrQz_ddSRMEI.jar' is inaccessible.
Sep 29, 2021 3:42:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:42:48.188Z: Staged package snappy-java-1.1.8.4-JMTR_B6J4HgzGrj0AamcrWhZm95KLkUWBCy1SMUbHD4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/snappy-java-1.1.8.4-JMTR_B6J4HgzGrj0AamcrWhZm95KLkUWBCy1SMUbHD4.jar' is inaccessible.
Sep 29, 2021 3:42:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-29T15:42:48.344Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 29, 2021 3:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-29T15:45:47.713Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 29, 2021 3:48:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:48:44.075Z: Staged package auto-value-annotations-1.8.1-N-wJtH1-01qZ0Tkn21yG_JBx9iD5Q-rV11cURpgxCFI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/auto-value-annotations-1.8.1-N-wJtH1-01qZ0Tkn21yG_JBx9iD5Q-rV11cURpgxCFI.jar' is inaccessible.
Sep 29, 2021 3:48:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:48:44.967Z: Staged package commons-compress-1.21-auz9VFlyillWAc-gcljRMZcv_Dm0kutIvdWWV3ovJEo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-compress-1.21-auz9VFlyillWAc-gcljRMZcv_Dm0kutIvdWWV3ovJEo.jar' is inaccessible.
Sep 29, 2021 3:48:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:48:45.127Z: Staged package conscrypt-openjdk-uber-2.5.1-AfnHQstZKhUeLmK9U5eomAYoqWcAH82s1KpHRGeGhfM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/conscrypt-openjdk-uber-2.5.1-AfnHQstZKhUeLmK9U5eomAYoqWcAH82s1KpHRGeGhfM.jar' is inaccessible.
Sep 29, 2021 3:48:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:48:47.199Z: Staged package netty-buffer-4.1.52.Final-QcoQNa7m4PgXdZfQp2EKWqLG5vrHRKyB_eTl5Y9jlnU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-buffer-4.1.52.Final-QcoQNa7m4PgXdZfQp2EKWqLG5vrHRKyB_eTl5Y9jlnU.jar' is inaccessible.
Sep 29, 2021 3:48:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:48:47.253Z: Staged package netty-codec-4.1.52.Final-PH543M5-NT2GrH26LCUatc3dTrhuqoic_kIpeCuCjlI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-4.1.52.Final-PH543M5-NT2GrH26LCUatc3dTrhuqoic_kIpeCuCjlI.jar' is inaccessible.
Sep 29, 2021 3:48:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:48:47.297Z: Staged package netty-codec-http-4.1.52.Final-tdoo0OqHiwfTERrPXk-vQSiO8jArJkFBkP3Rp9lzyAw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-http-4.1.52.Final-tdoo0OqHiwfTERrPXk-vQSiO8jArJkFBkP3Rp9lzyAw.jar' is inaccessible.
Sep 29, 2021 3:48:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:48:47.346Z: Staged package netty-codec-http2-4.1.52.Final-61J6QtIonRLe5tR1PKJxWXije7tBxxkVzk6VOcSfk6E.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-http2-4.1.52.Final-61J6QtIonRLe5tR1PKJxWXije7tBxxkVzk6VOcSfk6E.jar' is inaccessible.
Sep 29, 2021 3:48:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:48:47.392Z: Staged package netty-codec-socks-4.1.52.Final-_sxXfgu3ch1naM_BhG7hcpEdFquA372ZldllelUdTqc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-socks-4.1.52.Final-_sxXfgu3ch1naM_BhG7hcpEdFquA372ZldllelUdTqc.jar' is inaccessible.
Sep 29, 2021 3:48:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:48:47.427Z: Staged package netty-common-4.1.52.Final-DR8QHk231TDtBNfKvVfZXAdQ8Fm7az-t_tDIDUc0xxc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-common-4.1.52.Final-DR8QHk231TDtBNfKvVfZXAdQ8Fm7az-t_tDIDUc0xxc.jar' is inaccessible.
Sep 29, 2021 3:48:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:48:47.463Z: Staged package netty-handler-4.1.52.Final-LNwh-_rgSUDIWSkDl3nJP1H2GPXvshYHkxnc_PMqVcE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-handler-4.1.52.Final-LNwh-_rgSUDIWSkDl3nJP1H2GPXvshYHkxnc_PMqVcE.jar' is inaccessible.
Sep 29, 2021 3:48:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:48:47.496Z: Staged package netty-handler-proxy-4.1.52.Final-6NB4Y860vUgG4lWh1TLIZQ8POy1NAGrW0hgS6UG0L9w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-handler-proxy-4.1.52.Final-6NB4Y860vUgG4lWh1TLIZQ8POy1NAGrW0hgS6UG0L9w.jar' is inaccessible.
Sep 29, 2021 3:48:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:48:47.538Z: Staged package netty-resolver-4.1.52.Final-butyojOXnLRbC01tRsWkEzLFfAz4qAOx78YNu-y6CaY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-resolver-4.1.52.Final-butyojOXnLRbC01tRsWkEzLFfAz4qAOx78YNu-y6CaY.jar' is inaccessible.
Sep 29, 2021 3:48:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:48:47.596Z: Staged package netty-transport-4.1.52.Final-mj5vjA5V3jY-seoQ_ngXl-yjlOYhht8q4LTrK84LRUE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-transport-4.1.52.Final-mj5vjA5V3jY-seoQ_ngXl-yjlOYhht8q4LTrK84LRUE.jar' is inaccessible.
Sep 29, 2021 3:48:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:48:47.636Z: Staged package opencensus-api-0.28.0-DBcj8_bTBhMjhFzouIs1_dpQCBLgp1uOtfzErYyHGpU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/opencensus-api-0.28.0-DBcj8_bTBhMjhFzouIs1_dpQCBLgp1uOtfzErYyHGpU.jar' is inaccessible.
Sep 29, 2021 3:48:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:48:47.732Z: Staged package paranamer-2.8-aIyxGKYCHYGROOhVIIyVYDFoi-S0eiS7YVvsxjrO3wc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/paranamer-2.8-aIyxGKYCHYGROOhVIIyVYDFoi-S0eiS7YVvsxjrO3wc.jar' is inaccessible.
Sep 29, 2021 3:48:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:48:47.772Z: Staged package perfmark-api-0.23.0-xwW1wQwY_zAyuegXQrwvaw5WB_am38DIrQz_ddSRMEI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/perfmark-api-0.23.0-xwW1wQwY_zAyuegXQrwvaw5WB_am38DIrQz_ddSRMEI.jar' is inaccessible.
Sep 29, 2021 3:48:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:48:48.168Z: Staged package snappy-java-1.1.8.4-JMTR_B6J4HgzGrj0AamcrWhZm95KLkUWBCy1SMUbHD4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/snappy-java-1.1.8.4-JMTR_B6J4HgzGrj0AamcrWhZm95KLkUWBCy1SMUbHD4.jar' is inaccessible.
Sep 29, 2021 3:48:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-29T15:48:48.312Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 29, 2021 3:51:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-29T15:51:47.957Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 29, 2021 3:54:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:54:44.047Z: Staged package auto-value-annotations-1.8.1-N-wJtH1-01qZ0Tkn21yG_JBx9iD5Q-rV11cURpgxCFI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/auto-value-annotations-1.8.1-N-wJtH1-01qZ0Tkn21yG_JBx9iD5Q-rV11cURpgxCFI.jar' is inaccessible.
Sep 29, 2021 3:54:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:54:44.907Z: Staged package commons-compress-1.21-auz9VFlyillWAc-gcljRMZcv_Dm0kutIvdWWV3ovJEo.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/commons-compress-1.21-auz9VFlyillWAc-gcljRMZcv_Dm0kutIvdWWV3ovJEo.jar' is inaccessible.
Sep 29, 2021 3:54:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:54:45.051Z: Staged package conscrypt-openjdk-uber-2.5.1-AfnHQstZKhUeLmK9U5eomAYoqWcAH82s1KpHRGeGhfM.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/conscrypt-openjdk-uber-2.5.1-AfnHQstZKhUeLmK9U5eomAYoqWcAH82s1KpHRGeGhfM.jar' is inaccessible.
Sep 29, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:54:46.867Z: Staged package netty-buffer-4.1.52.Final-QcoQNa7m4PgXdZfQp2EKWqLG5vrHRKyB_eTl5Y9jlnU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-buffer-4.1.52.Final-QcoQNa7m4PgXdZfQp2EKWqLG5vrHRKyB_eTl5Y9jlnU.jar' is inaccessible.
Sep 29, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:54:46.901Z: Staged package netty-codec-4.1.52.Final-PH543M5-NT2GrH26LCUatc3dTrhuqoic_kIpeCuCjlI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-4.1.52.Final-PH543M5-NT2GrH26LCUatc3dTrhuqoic_kIpeCuCjlI.jar' is inaccessible.
Sep 29, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:54:46.936Z: Staged package netty-codec-http-4.1.52.Final-tdoo0OqHiwfTERrPXk-vQSiO8jArJkFBkP3Rp9lzyAw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-http-4.1.52.Final-tdoo0OqHiwfTERrPXk-vQSiO8jArJkFBkP3Rp9lzyAw.jar' is inaccessible.
Sep 29, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:54:46.979Z: Staged package netty-codec-http2-4.1.52.Final-61J6QtIonRLe5tR1PKJxWXije7tBxxkVzk6VOcSfk6E.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-http2-4.1.52.Final-61J6QtIonRLe5tR1PKJxWXije7tBxxkVzk6VOcSfk6E.jar' is inaccessible.
Sep 29, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:54:47.035Z: Staged package netty-codec-socks-4.1.52.Final-_sxXfgu3ch1naM_BhG7hcpEdFquA372ZldllelUdTqc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-codec-socks-4.1.52.Final-_sxXfgu3ch1naM_BhG7hcpEdFquA372ZldllelUdTqc.jar' is inaccessible.
Sep 29, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:54:47.071Z: Staged package netty-common-4.1.52.Final-DR8QHk231TDtBNfKvVfZXAdQ8Fm7az-t_tDIDUc0xxc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-common-4.1.52.Final-DR8QHk231TDtBNfKvVfZXAdQ8Fm7az-t_tDIDUc0xxc.jar' is inaccessible.
Sep 29, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:54:47.112Z: Staged package netty-handler-4.1.52.Final-LNwh-_rgSUDIWSkDl3nJP1H2GPXvshYHkxnc_PMqVcE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-handler-4.1.52.Final-LNwh-_rgSUDIWSkDl3nJP1H2GPXvshYHkxnc_PMqVcE.jar' is inaccessible.
Sep 29, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:54:47.151Z: Staged package netty-handler-proxy-4.1.52.Final-6NB4Y860vUgG4lWh1TLIZQ8POy1NAGrW0hgS6UG0L9w.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-handler-proxy-4.1.52.Final-6NB4Y860vUgG4lWh1TLIZQ8POy1NAGrW0hgS6UG0L9w.jar' is inaccessible.
Sep 29, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:54:47.194Z: Staged package netty-resolver-4.1.52.Final-butyojOXnLRbC01tRsWkEzLFfAz4qAOx78YNu-y6CaY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-resolver-4.1.52.Final-butyojOXnLRbC01tRsWkEzLFfAz4qAOx78YNu-y6CaY.jar' is inaccessible.
Sep 29, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:54:47.254Z: Staged package netty-transport-4.1.52.Final-mj5vjA5V3jY-seoQ_ngXl-yjlOYhht8q4LTrK84LRUE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/netty-transport-4.1.52.Final-mj5vjA5V3jY-seoQ_ngXl-yjlOYhht8q4LTrK84LRUE.jar' is inaccessible.
Sep 29, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:54:47.293Z: Staged package opencensus-api-0.28.0-DBcj8_bTBhMjhFzouIs1_dpQCBLgp1uOtfzErYyHGpU.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/opencensus-api-0.28.0-DBcj8_bTBhMjhFzouIs1_dpQCBLgp1uOtfzErYyHGpU.jar' is inaccessible.
Sep 29, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:54:47.371Z: Staged package paranamer-2.8-aIyxGKYCHYGROOhVIIyVYDFoi-S0eiS7YVvsxjrO3wc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/paranamer-2.8-aIyxGKYCHYGROOhVIIyVYDFoi-S0eiS7YVvsxjrO3wc.jar' is inaccessible.
Sep 29, 2021 3:54:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:54:47.411Z: Staged package perfmark-api-0.23.0-xwW1wQwY_zAyuegXQrwvaw5WB_am38DIrQz_ddSRMEI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/perfmark-api-0.23.0-xwW1wQwY_zAyuegXQrwvaw5WB_am38DIrQz_ddSRMEI.jar' is inaccessible.
Sep 29, 2021 3:54:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T15:54:47.835Z: Staged package snappy-java-1.1.8.4-JMTR_B6J4HgzGrj0AamcrWhZm95KLkUWBCy1SMUbHD4.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/snappy-java-1.1.8.4-JMTR_B6J4HgzGrj0AamcrWhZm95KLkUWBCy1SMUbHD4.jar' is inaccessible.
Sep 29, 2021 3:54:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-29T15:54:47.973Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 29, 2021 3:57:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-29T15:57:47.424Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
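[Editor's note] For triage, the repeated SEVERE lines above can be condensed into a list of the jars Dataflow could not read from the staging bucket. A minimal sketch of such a helper (illustrative only, not part of the Beam tooling) that parses log text of this shape:

```python
import re

# Matches the SEVERE lines emitted by MonitoringUtil$LoggingHandler above, e.g.
# "SEVERE: ...: Staged package foo.jar at location 'gs://...' is inaccessible."
PATTERN = re.compile(
    r"Staged package (?P<jar>\S+) at location '(?P<uri>gs://[^']+)' is inaccessible"
)

def inaccessible_packages(log_text: str) -> list[tuple[str, str]]:
    """Return (jar, gcs_uri) pairs for every inaccessible staged package."""
    return [(m.group("jar"), m.group("uri")) for m in PATTERN.finditer(log_text)]

sample = (
    "SEVERE: 2021-09-29T15:54:47.835Z: Staged package snappy-java-1.1.8.4-JMTR.jar "
    "at location 'gs://temp-storage-for-perf-tests/loadtests/staging/"
    "snappy-java-1.1.8.4-JMTR.jar' is inaccessible.\n"
)
print(inaccessible_packages(sample))
```

In this run every listed jar lives under the same gs://temp-storage-for-perf-tests/loadtests/staging/ prefix, so a single permissions problem on that bucket would explain the whole block.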
Sep 29, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-29T16:00:26.162Z: Cancel request is committed for workflow job: 2021-09-29_05_45_29-3890809183779990448.
Sep 29, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-29T16:00:26.197Z: Cleaning up.
Sep 29, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-29T16:00:26.282Z: Stopping worker pool...
Sep 29, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-29T16:00:26.325Z: Stopping worker pool...
Sep 29, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-29T16:00:34.655Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:1089
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:1089
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:1089
Sep 29, 2021 4:02:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-29T16:02:47.909Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 29, 2021 4:02:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-29T16:02:47.972Z: Worker pool stopped.
Sep 29, 2021 4:03:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-29_05_45_29-3890809183779990448 finished with status CANCELLED.
Load test results for test (ID): 685e7028-0ab6-48c7-bc07-c3efc17bedcc and timestamp: 2021-09-29T12:45:24.386000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11548.812
dataflow_v2_java11_total_bytes_count             2.64142272E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
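[Editor's note] The RuntimeException above comes from the load-test harness treating any terminal job state other than success as a failure. The check amounts to something like the following (a simplified Python rendering of the idea, not the actual JobFailure implementation):

```python
# Terminal Dataflow job states; for a load test, only DONE counts as success.
SUCCESS_STATES = {"DONE"}
TERMINAL_STATES = {"DONE", "FAILED", "CANCELLED", "UPDATED", "DRAINED"}

def handle_failure(state: str) -> None:
    """Raise if the job ended in a terminal state that is not a success."""
    if state in TERMINAL_STATES and state not in SUCCESS_STATES:
        raise RuntimeError(f"Invalid job state: {state}.")

handle_failure("DONE")        # no error
try:
    handle_failure("CANCELLED")
except RuntimeError as e:
    print(e)                  # Invalid job state: CANCELLED.
```

Here the job was cancelled externally at the 2021-09-29T16:00:26Z mark after roughly 11549 seconds of runtime, so the CANCELLED state, not a pipeline error, is what fails the Gradle task.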

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210929124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d1425d4294767e687ebbd35f632b2c2c6b5e12a0fbd0da744d90ca18892ae676
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210929124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d1425d4294767e687ebbd35f632b2c2c6b5e12a0fbd0da744d90ca18892ae676]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210929124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d1425d4294767e687ebbd35f632b2c2c6b5e12a0fbd0da744d90ca18892ae676])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d1425d4294767e687ebbd35f632b2c2c6b5e12a0fbd0da744d90ca18892ae676
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d1425d4294767e687ebbd35f632b2c2c6b5e12a0fbd0da744d90ca18892ae676
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d1425d4294767e687ebbd35f632b2c2c6b5e12a0fbd0da744d90ca18892ae676].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 50s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/57qasuxi7v6q6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #103

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/103/display/redirect?page=changes>

Changes:

[Andrew Pilloud] [BEAM-12691] FieldAccessDescriptor for BeamCalcRel

[aydar.zaynutdinov] Init basic go backend structure

[noreply] Update container tags used by unreleased SDKs.

[aydar.zaynutdinov] Add .gitkeep file as an exclusion

[yileiyang] Remove encoding= from the json.loads call.

[noreply] [BEAM-12798] Add configurable combiner packing limit (#15391)

[noreply] [BEAM-12926] Translates Reshuffle with Samza native repartition operator

[noreply] [BEAM-11007] Allow nbconvert 6.x (#15595)

[noreply] [BEAM-12945] Fix crashes when importing the DataFrame API with pandas

[noreply] [BEAM-12611] Capture a greater proporition of logs associated to an


------------------------------------------
[...truncated 48.98 KB...]
102bc031837a: Preparing
fcd0be6cd8e4: Preparing
a98a451ba0e5: Preparing
dbc817088159: Preparing
8330a8eb6279: Preparing
6e85f2bf577d: Preparing
874ad65f91ea: Preparing
0fc2498b65e5: Preparing
d08e6b97bf21: Preparing
3054497613e6: Preparing
d35dc7f4c79e: Preparing
dabfe5b2ea81: Preparing
5e6a409f30b6: Preparing
102bc031837a: Waiting
0fc2498b65e5: Waiting
fcd0be6cd8e4: Waiting
d08e6b97bf21: Waiting
d829bd2ded64: Waiting
3054497613e6: Waiting
a98a451ba0e5: Waiting
d35dc7f4c79e: Waiting
dabfe5b2ea81: Waiting
874ad65f91ea: Waiting
5e6a409f30b6: Waiting
dbc817088159: Waiting
8330a8eb6279: Waiting
6e85f2bf577d: Waiting
0ad119aed9fc: Pushed
7ff1eed6bbc9: Pushed
dcfa9e2c6b21: Pushed
501342684d0d: Pushed
d829bd2ded64: Pushed
ae46df76eb5b: Pushed
fcd0be6cd8e4: Pushed
a98a451ba0e5: Pushed
874ad65f91ea: Layer already exists
0fc2498b65e5: Layer already exists
d08e6b97bf21: Layer already exists
3054497613e6: Layer already exists
8330a8eb6279: Pushed
d35dc7f4c79e: Layer already exists
102bc031837a: Pushed
dabfe5b2ea81: Layer already exists
6e85f2bf577d: Pushed
5e6a409f30b6: Layer already exists
dbc817088159: Pushed
20210928124333: digest: sha256:26e03b4ba2a39ade88921f2105ec0c0daa1bd2c2564f51ca086590a919cf1576 size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 28, 2021 12:45:48 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 28, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 28, 2021 12:45:49 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 28, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 28, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 28, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Sep 28, 2021 12:45:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 28, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash d93f0953b14850cca27eac110558c862f5d119db6d9c263859446a8949dce14a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-2T8JU7FIUMyifqwRBVjIYvXRGdttnCY4WURqiUnc4Uo.pb
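[Editor's note] The upload line above pairs a hex SHA-256 with the staged file name: the token after "pipeline-" is that same digest, URL-safe base64 encoded with the padding stripped (the staged jar names earlier in this log follow the same scheme). A short sketch of the encoding, inferred from the log output rather than taken from Beam source:

```python
import base64

def staging_name(prefix: str, sha256_hex: str, ext: str) -> str:
    """Content-addressed staging file name: <prefix>-<urlsafe-b64(digest)>.<ext>."""
    digest = bytes.fromhex(sha256_hex)
    token = base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")
    return f"{prefix}-{token}.{ext}"

print(staging_name(
    "pipeline",
    "d93f0953b14850cca27eac110558c862f5d119db6d9c263859446a8949dce14a",
    "pb",
))  # pipeline-2T8JU7FIUMyifqwRBVjIYvXRGdttnCY4WURqiUnc4Uo.pb
```

Content-addressed names are why "195 files cached, 0 files newly uploaded" appears above: an unchanged artifact hashes to the same name, so the stager can skip the upload.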
Sep 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 28, 2021 12:45:53 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc]
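[Editor's note] The split step above carves the synthetic unbounded source into a fixed number of bundles. The arithmetic behind such a split is the usual even partition of N records into k contiguous sub-ranges; a hedged sketch (illustrative only, not the SyntheticUnboundedSource implementation):

```python
def split_ranges(total_records: int, num_bundles: int) -> list[tuple[int, int]]:
    """Partition [0, total_records) into num_bundles contiguous half-open ranges."""
    base, extra = divmod(total_records, num_bundles)
    ranges, start = [], 0
    for i in range(num_bundles):
        # Spread any remainder over the first `extra` bundles, one record each.
        size = base + (1 if i < extra else 0)
        ranges.append((start, start + size))
        start += size
    return ranges

print(split_ranges(1000, 20))  # 20 equal ranges of 50 records each
```

Each resulting range then backs one sub-source, which is why the log prints exactly 20 SyntheticUnboundedSource instances per input.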
Sep 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 28, 2021 12:45:53 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e]
Sep 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 28, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 28, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-28_05_45_53-13740441012094939605?project=apache-beam-testing
Sep 28, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-28_05_45_53-13740441012094939605
Sep 28, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-28_05_45_53-13740441012094939605
Sep 28, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-28T12:46:02.361Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-lnn6. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 28, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:05.348Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 28, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.194Z: Expanding SplittableParDo operations into optimizable parts.
Sep 28, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.229Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 28, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.289Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.358Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.385Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.452Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.547Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.592Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.627Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.660Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.696Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.727Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.759Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.789Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.822Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.853Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.879Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.903Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.937Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:06.968Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:07.005Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:07.037Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:07.084Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:07.109Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:07.137Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:07.174Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:07.200Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:07.230Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:07.263Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
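[Editor's note] The long run of "Fusing consumer A into B" messages above is the Dataflow optimizer greedily merging producer-consumer pairs into single execution stages, so fused steps exchange elements in memory instead of through shuffle. A toy union-find version of that bookkeeping (purely illustrative; the real optimizer also weighs GBK boundaries, side inputs, and flatten unzipping as seen in the log):

```python
class Fuser:
    """Greedy producer-consumer fusion: each fuse() merges two ops into one stage."""

    def __init__(self):
        self.parent: dict[str, str] = {}

    def _find(self, op: str) -> str:
        self.parent.setdefault(op, op)
        while self.parent[op] != op:
            self.parent[op] = self.parent[self.parent[op]]  # path halving
            op = self.parent[op]
        return op

    def fuse(self, consumer: str, producer: str) -> None:
        """Mirror of a 'Fusing consumer <consumer> into <producer>' log line."""
        self.parent[self._find(consumer)] = self._find(producer)

    def stage_of(self, op: str) -> str:
        return self._find(op)

f = Fuser()
f.fuse("StripIds", "ProcessElementAndRestrictionWithSizing")
f.fuse("Collect start time metrics", "StripIds")
f.fuse("Window.Assign", "Collect start time metrics")
print(f.stage_of("Window.Assign"))  # ProcessElementAndRestrictionWithSizing
```

Applied to the lines above, the whole read-to-union-table chain on each input collapses into one stage on either side of the CoGroupByKey/GBK shuffle boundary.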
Sep 28, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:07.641Z: Starting 5 workers in us-central1-a...
Sep 28, 2021 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:13.916Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 28, 2021 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:49.160Z: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 28, 2021 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:49.188Z: Resized worker pool to 2, though goal was 5. This could be a quota issue.
Sep 28, 2021 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:46:59.508Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 28, 2021 12:47:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:47:52.450Z: Workers have started successfully.
Sep 28, 2021 12:47:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T12:47:52.486Z: Workers have started successfully.
Sep 28, 2021 3:46:28 PM org.apache.beam.sdk.metrics.MetricsEnvironment getCurrentContainer
WARNING: Reporting metrics are not supported in the current execution environment.
Sep 28, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T16:00:32.968Z: Cancel request is committed for workflow job: 2021-09-28_05_45_53-13740441012094939605.
Sep 28, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T16:00:33.043Z: Cleaning up.
Sep 28, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T16:00:33.104Z: Stopping worker pool...
Sep 28, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T16:00:33.184Z: Stopping worker pool...
Sep 28, 2021 4:02:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T16:02:51.473Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 28, 2021 4:02:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-28T16:02:51.515Z: Worker pool stopped.
Sep 28, 2021 4:02:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-28_05_45_53-13740441012094939605 finished with status CANCELLED.
Load test results for test (ID): 8bc866d9-07ee-444d-9b12-1c9236191504 and timestamp: 2021-09-28T12:45:48.907000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11523.763
dataflow_v2_java11_total_bytes_count             2.80484304E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210928124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:26e03b4ba2a39ade88921f2105ec0c0daa1bd2c2564f51ca086590a919cf1576
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210928124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:26e03b4ba2a39ade88921f2105ec0c0daa1bd2c2564f51ca086590a919cf1576]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210928124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:26e03b4ba2a39ade88921f2105ec0c0daa1bd2c2564f51ca086590a919cf1576])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:26e03b4ba2a39ade88921f2105ec0c0daa1bd2c2564f51ca086590a919cf1576
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:26e03b4ba2a39ade88921f2105ec0c0daa1bd2c2564f51ca086590a919cf1576
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:26e03b4ba2a39ade88921f2105ec0c0daa1bd2c2564f51ca086590a919cf1576].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 43s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/n3zkw263wznzy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #102

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/102/display/redirect>

Changes:


------------------------------------------
[...truncated 49.04 KB...]
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
e1d8cf3fc310: Preparing
517d57875d1a: Preparing
e6b445379188: Preparing
b5dc947b0725: Preparing
d5e33691d531: Preparing
fa3ed5c81f79: Preparing
5bca6def87fa: Preparing
6d480301d481: Preparing
b4830f2cf7da: Preparing
659fa2f7ffc5: Preparing
2577fbe182b2: Preparing
8db2d63d863b: Preparing
3891808a925b: Preparing
d402f4f1b906: Preparing
00ef5416d927: Preparing
8555e663f65b: Preparing
d00da3cd7763: Preparing
4e61e63529c2: Preparing
799760671c38: Preparing
fa3ed5c81f79: Waiting
d00da3cd7763: Waiting
d402f4f1b906: Waiting
2577fbe182b2: Waiting
4e61e63529c2: Waiting
5bca6def87fa: Waiting
799760671c38: Waiting
8555e663f65b: Waiting
8db2d63d863b: Waiting
00ef5416d927: Waiting
6d480301d481: Waiting
3891808a925b: Waiting
659fa2f7ffc5: Waiting
d5e33691d531: Pushed
517d57875d1a: Pushed
e1d8cf3fc310: Pushed
fa3ed5c81f79: Pushed
b5dc947b0725: Pushed
b4830f2cf7da: Pushed
5bca6def87fa: Pushed
6d480301d481: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
d00da3cd7763: Layer already exists
4e61e63529c2: Layer already exists
799760671c38: Layer already exists
2577fbe182b2: Pushed
8db2d63d863b: Pushed
659fa2f7ffc5: Pushed
e6b445379188: Pushed
20210927124331: digest: sha256:1caf78f9c5e9c3bd6a8ec4ad0e57addca65abd38d011c2ac3391d07786749ef8 size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 27, 2021 12:45:46 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 27, 2021 12:45:46 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 27, 2021 12:45:47 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 27, 2021 12:45:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 27, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 27, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 1 seconds
Sep 27, 2021 12:45:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 27, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash 73a482e146db5acdd4cc2ed93cdf9fe2b86e4a8e50954daf7940ca7c3a48befb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-c6SC4UbbWs3UzC7ZPN-f4rhuSo5QlU2veUDKfDpIvvs.pb
Sep 27, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 27, 2021 12:45:53 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc]
Sep 27, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 27, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 27, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 27, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 27, 2021 12:45:53 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e]
Sep 27, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 27, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 27, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 27, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 27, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 27, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 27, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 27, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 27, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 27, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 27, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 27, 2021 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 27, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-27_05_45_53-13018494378937361099?project=apache-beam-testing
Sep 27, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-27_05_45_53-13018494378937361099
Sep 27, 2021 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-27_05_45_53-13018494378937361099
Sep 27, 2021 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-27T12:46:01.285Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-mjpp. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 27, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:05.040Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:05.742Z: Expanding SplittableParDo operations into optimizable parts.
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:05.794Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:05.851Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:05.925Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:05.963Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.035Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.152Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.202Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.240Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.269Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.303Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.339Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.372Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.410Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.439Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.487Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.519Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.548Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.580Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.605Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 27, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.640Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 27, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.673Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 27, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.716Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 27, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.746Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Sep 27, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.804Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Sep 27, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.835Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Sep 27, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.866Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 27, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.902Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Sep 27, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:06.935Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Sep 27, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:07.340Z: Starting 5 workers in us-central1-a...
Sep 27, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:37.690Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 27, 2021 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:46:47.652Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 27, 2021 12:47:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:47:44.864Z: Workers have started successfully.
Sep 27, 2021 12:47:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T12:47:44.882Z: Workers have started successfully.
Sep 27, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T16:00:27.941Z: Cancel request is committed for workflow job: 2021-09-27_05_45_53-13018494378937361099.
Sep 27, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T16:00:28.895Z: Cleaning up.
Sep 27, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T16:00:28.971Z: Stopping worker pool...
Sep 27, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T16:00:29.048Z: Stopping worker pool...
Sep 27, 2021 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T16:02:51.112Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 27, 2021 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-27T16:02:51.156Z: Worker pool stopped.
Sep 27, 2021 4:03:00 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-27_05_45_53-13018494378937361099 finished with status CANCELLED.
Load test results for test (ID): 24917cd4-4ae7-4644-ba38-ccd18a03021b and timestamp: 2021-09-27T12:45:46.980000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11533.374
dataflow_v2_java11_total_bytes_count             2.72591798E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210927124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1caf78f9c5e9c3bd6a8ec4ad0e57addca65abd38d011c2ac3391d07786749ef8
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210927124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1caf78f9c5e9c3bd6a8ec4ad0e57addca65abd38d011c2ac3391d07786749ef8]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210927124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1caf78f9c5e9c3bd6a8ec4ad0e57addca65abd38d011c2ac3391d07786749ef8])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1caf78f9c5e9c3bd6a8ec4ad0e57addca65abd38d011c2ac3391d07786749ef8
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1caf78f9c5e9c3bd6a8ec4ad0e57addca65abd38d011c2ac3391d07786749ef8
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1caf78f9c5e9c3bd6a8ec4ad0e57addca65abd38d011c2ac3391d07786749ef8].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 45s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ciitnhbt3vi44

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #101

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/101/display/redirect>

Changes:


------------------------------------------
[...truncated 49.91 KB...]
63f8e7a34014: Pushed
7949805e1c86: Pushed
3a963485e68e: Pushed
a131c77e6716: Pushed
6222ed7176f7: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
d00da3cd7763: Layer already exists
631bf01a1485: Pushed
d776603a71a0: Pushed
4e61e63529c2: Layer already exists
799760671c38: Layer already exists
9bbcdc967ab2: Pushed
0fda83d3a2e3: Pushed
20210926124332: digest: sha256:5678472199be1c1cf321f70f99b280886c12262960474a7c6e9194f86255e67b size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 26, 2021 12:45:13 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 26, 2021 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 26, 2021 12:45:14 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 26, 2021 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 26, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 26, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Sep 26, 2021 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 26, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash d19829175c9682d54e17e61b9d3ace3fb8fa54f1df73a6663c6068e53199d8f3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-0ZgpF1yWgtVOF-YbnTrOP7j6VPHfc6ZmPGBo5TGZ2PM.pb
Sep 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 26, 2021 12:45:19 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc]
Sep 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 26, 2021 12:45:19 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e]
Sep 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 26, 2021 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-26_05_45_19-3650379884586848115?project=apache-beam-testing
Sep 26, 2021 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-26_05_45_19-3650379884586848115
Sep 26, 2021 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-26_05_45_19-3650379884586848115
Sep 26, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-26T12:45:26.493Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-l6tm. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 26, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:30.570Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 26, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:31.366Z: Expanding SplittableParDo operations into optimizable parts.
Sep 26, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:31.395Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 26, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:31.452Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:31.529Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:31.558Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:31.613Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:31.721Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:31.753Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:31.786Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:31.812Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:31.844Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:31.867Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:31.900Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:31.933Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:31.975Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:31.999Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:32.032Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:32.064Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:32.111Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:32.145Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:32.181Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:32.211Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:32.244Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:32.267Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:32.291Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:32.323Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:32.356Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:32.377Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:32.409Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Sep 26, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:32.680Z: Starting 5 workers in us-central1-a...
Sep 26, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:45:55.156Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 26, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:46:17.944Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 26, 2021 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:47:13.687Z: Workers have started successfully.
Sep 26, 2021 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T12:47:13.717Z: Workers have started successfully.
Sep 26, 2021 3:48:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-26T15:48:33.609Z: Staged package classgraph-4.8.104-ePvfM6WdiJMBzk05ytbzmopew--T1rqGqw_PYcvi6h0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/classgraph-4.8.104-ePvfM6WdiJMBzk05ytbzmopew--T1rqGqw_PYcvi6h0.jar' is inaccessible.
Sep 26, 2021 3:48:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-26T15:48:33.974Z: Staged package flogger-0.6-MUVmnbYMex1jtOH3E36hpd5uqRxRamjoz90leU2Bt8Q.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/flogger-0.6-MUVmnbYMex1jtOH3E36hpd5uqRxRamjoz90leU2Bt8Q.jar' is inaccessible.
Sep 26, 2021 3:48:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-26T15:48:34.013Z: Staged package flogger-system-backend-0.6-r3-fBJx59nlB4eR2N7IGNFlHiDmKoKq83bx68K899bE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/flogger-system-backend-0.6-r3-fBJx59nlB4eR2N7IGNFlHiDmKoKq83bx68K899bE.jar' is inaccessible.
Sep 26, 2021 3:48:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-26T15:48:34.497Z: Staged package google-extensions-0.6-snj9TtDjAZ6b9UseOnnNS2R0N28B5AIdtdTYnu7zjO0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-extensions-0.6-snj9TtDjAZ6b9UseOnnNS2R0N28B5AIdtdTYnu7zjO0.jar' is inaccessible.
Sep 26, 2021 3:48:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-26T15:48:35.395Z: Staged package junit-4.13.1-wwcZ25dNZFJ5P-GRs2OKV3cAVIW64UWSQERTD_pfYSI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/junit-4.13.1-wwcZ25dNZFJ5P-GRs2OKV3cAVIW64UWSQERTD_pfYSI.jar' is inaccessible.
Sep 26, 2021 3:48:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-26T15:48:35.475Z: Staged package kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar' is inaccessible.
Sep 26, 2021 3:48:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-26T15:48:35.554Z: Staged package lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar' is inaccessible.
Sep 26, 2021 3:48:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-26T15:48:36.322Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
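Each of the SEVERE messages above names a staged jar under gs://temp-storage-for-perf-tests/loadtests/staging/. One way to triage this kind of failure is to pull the gs:// URIs out of the log and check each object directly (for example with `gsutil stat`). A minimal sketch, assuming the log lines follow the format shown above; the helper name and regex are illustrative, not part of Beam:

```python
import re

# Matches the URI quoted in "Staged package ... at location '<uri>' is inaccessible."
STAGED_URI_RE = re.compile(r"at location '(gs://[^']+)' is inaccessible")

def inaccessible_uris(log_lines):
    """Yield the gs:// URIs of staged packages reported as inaccessible."""
    for line in log_lines:
        m = STAGED_URI_RE.search(line)
        if m:
            yield m.group(1)

if __name__ == "__main__":
    sample = ("SEVERE: 2021-09-26T15:48:33.609Z: Staged package classgraph-4.8.104.jar "
              "at location 'gs://temp-storage-for-perf-tests/loadtests/staging/classgraph-4.8.104.jar' "
              "is inaccessible.")
    for uri in inaccessible_uris([sample]):
        # Each URI could then be checked with: gsutil stat <uri>
        print(uri)
```

Checking whether the objects exist and whether the job's service account can read them usually distinguishes an expired/cleaned staging bucket from a permissions problem.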
Sep 26, 2021 3:51:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-26T15:51:35.804Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 26, 2021 3:54:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-26T15:54:33.626Z: Staged package classgraph-4.8.104-ePvfM6WdiJMBzk05ytbzmopew--T1rqGqw_PYcvi6h0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/classgraph-4.8.104-ePvfM6WdiJMBzk05ytbzmopew--T1rqGqw_PYcvi6h0.jar' is inaccessible.
Sep 26, 2021 3:54:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-26T15:54:33.985Z: Staged package flogger-0.6-MUVmnbYMex1jtOH3E36hpd5uqRxRamjoz90leU2Bt8Q.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/flogger-0.6-MUVmnbYMex1jtOH3E36hpd5uqRxRamjoz90leU2Bt8Q.jar' is inaccessible.
Sep 26, 2021 3:54:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-26T15:54:34.044Z: Staged package flogger-system-backend-0.6-r3-fBJx59nlB4eR2N7IGNFlHiDmKoKq83bx68K899bE.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/flogger-system-backend-0.6-r3-fBJx59nlB4eR2N7IGNFlHiDmKoKq83bx68K899bE.jar' is inaccessible.
Sep 26, 2021 3:54:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-26T15:54:34.513Z: Staged package google-extensions-0.6-snj9TtDjAZ6b9UseOnnNS2R0N28B5AIdtdTYnu7zjO0.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-extensions-0.6-snj9TtDjAZ6b9UseOnnNS2R0N28B5AIdtdTYnu7zjO0.jar' is inaccessible.
Sep 26, 2021 3:54:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-26T15:54:35.533Z: Staged package junit-4.13.1-wwcZ25dNZFJ5P-GRs2OKV3cAVIW64UWSQERTD_pfYSI.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/junit-4.13.1-wwcZ25dNZFJ5P-GRs2OKV3cAVIW64UWSQERTD_pfYSI.jar' is inaccessible.
Sep 26, 2021 3:54:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-26T15:54:35.575Z: Staged package kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/kafka-clients-2.4.1-or4Vrm01S3aXE64MuzGE8iWqFvyYIFseskr9n8KjYNk.jar' is inaccessible.
Sep 26, 2021 3:54:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-26T15:54:35.652Z: Staged package lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/lz4-java-1.6.0-0ilUWqKx1SA8h2YUvbz_yswwNpf0-PJvdk4dbB7S5BY.jar' is inaccessible.
Sep 26, 2021 3:54:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-26T15:54:36.391Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 26, 2021 3:57:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-26T15:57:36.014Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 26, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T16:00:26.128Z: Cancel request is committed for workflow job: 2021-09-26_05_45_19-3650379884586848115.
Sep 26, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T16:00:26.153Z: Cleaning up.
Sep 26, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T16:00:26.224Z: Stopping worker pool...
Sep 26, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T16:00:26.300Z: Stopping worker pool...
Sep 26, 2021 4:02:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T16:02:53.030Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 26, 2021 4:02:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-26T16:02:53.060Z: Worker pool stopped.
Sep 26, 2021 4:02:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-26_05_45_19-3650379884586848115 finished with status CANCELLED.
Load test results for test (ID): f4198f48-e720-496d-807a-02271bf0782d and timestamp: 2021-09-26T12:45:14.541000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11558.366
dataflow_v2_java11_total_bytes_count             3.60383495E10
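Dividing the two reported metrics gives a rough average throughput for the run, which is a quick sanity check when comparing against earlier builds of this job. A minimal sketch using the values printed above:

```python
runtime_sec = 11558.366          # dataflow_v2_java11_runtime_sec
total_bytes = 3.60383495e10      # dataflow_v2_java11_total_bytes_count

throughput_bytes_per_sec = total_bytes / runtime_sec
print(f"{throughput_bytes_per_sec / 1e6:.2f} MB/s")  # roughly 3.1 MB/s
```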
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210926124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5678472199be1c1cf321f70f99b280886c12262960474a7c6e9194f86255e67b
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210926124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5678472199be1c1cf321f70f99b280886c12262960474a7c6e9194f86255e67b]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210926124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5678472199be1c1cf321f70f99b280886c12262960474a7c6e9194f86255e67b])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5678472199be1c1cf321f70f99b280886c12262960474a7c6e9194f86255e67b
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5678472199be1c1cf321f70f99b280886c12262960474a7c6e9194f86255e67b
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5678472199be1c1cf321f70f99b280886c12262960474a7c6e9194f86255e67b].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 42s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/q5tp57ab4fjti

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #100

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/100/display/redirect?page=changes>

Changes:

[Robert Bradshaw] [BEAM-12934] Use environment capabilities to determine length prefixing.

[Brian Hulette] Add missing comma

[ryanthompson591] Update tensorflow to version 2.6.0

[zhoufek] [BEAM-9487] Fix incorrected Repeatedly.may_finish implementation

[noreply] Cleanup use of futures. (#15043)

[noreply] Go Lint fix for wordcount and metrics (#15580)

[Brian Hulette] Fix whitespace lint

[noreply] [BEAM-3304] Snippets for trigger in BPG (#15409)

[noreply] [BEAM-12832] Add Go SDK xlang info to programming guide. (#15447)

[noreply] [BEAM-11097] Add SideInputCache to StateReader (#15563)

[Robert Bradshaw] Revert "Avoid apiary submission of job graph when it is not needed.

[noreply] [BEAM-12769] Java-emulating external transform. (#15546)

[noreply] [BEAM-12913] Pass query priority from ReadAllFromBigQuery (#15584)

[noreply] [BEAM-11982] Java Spanner - Implement IO Request Count metrics (#15493)


------------------------------------------
[...truncated 49.63 KB...]
d402f4f1b906: Waiting
d00da3cd7763: Waiting
00ef5416d927: Waiting
3891808a925b: Waiting
4e61e63529c2: Waiting
5c94d85de49e: Waiting
799760671c38: Waiting
cf75dda46606: Waiting
1d45e4518a0b: Waiting
b7076cd14284: Pushed
f387b53ed4eb: Pushed
076fe47c73d8: Pushed
cf75dda46606: Pushed
66bcc507711f: Pushed
6c5de256224e: Pushed
d088eb68bda8: Pushed
5c94d85de49e: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
f8ac61d12b49: Pushed
8555e663f65b: Layer already exists
4e61e63529c2: Layer already exists
d00da3cd7763: Layer already exists
799760671c38: Layer already exists
420c8f6b2459: Pushed
1d45e4518a0b: Pushed
7a245f5f2301: Pushed
20210925124344: digest: sha256:ef445dbd7be2158e45a67cf1daf163a663fe113fa17e184c7d3cf03010be22fc size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 25, 2021 12:46:00 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 25, 2021 12:46:00 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 25, 2021 12:46:01 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 25, 2021 12:46:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 25, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 25, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 2 seconds
Sep 25, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 25, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash 52a52e31e31ed2e46ac26e343b59a3d8073cfc52469f05425b0844aeeb3e89ed> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-UqUuMeMe0uRqwm40O1mj2Ac8_FJGnwVCWwhErus-ie0.pb
Sep 25, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 25, 2021 12:46:08 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc]
Sep 25, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 25, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 25, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 25, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 25, 2021 12:46:08 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e]
Sep 25, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 25, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 25, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 25, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 25, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 25, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 25, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 25, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 25, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 25, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 25, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 25, 2021 12:46:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 25, 2021 12:46:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-25_05_46_08-17003214306424250645?project=apache-beam-testing
Sep 25, 2021 12:46:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-25_05_46_08-17003214306424250645
Sep 25, 2021 12:46:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-25_05_46_08-17003214306424250645
Sep 25, 2021 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-25T12:46:14.497Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-eb79. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:17.677Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:18.473Z: Expanding SplittableParDo operations into optimizable parts.
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:18.508Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:18.588Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:18.654Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:18.680Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:18.741Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:18.838Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:18.872Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:18.905Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:18.928Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:18.950Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:18.982Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:19.016Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:19.038Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:19.065Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:19.099Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:19.124Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:19.158Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:19.191Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:19.238Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:19.265Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:19.309Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:19.336Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:19.372Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:19.405Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:19.438Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:19.474Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:19.506Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Sep 25, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:19.539Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
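The "Fusing consumer X into Y" messages above record the optimizer chaining each ParDo directly onto its producer, so elements flow through the fused stage one at a time instead of being materialized between steps. A toy Python sketch of the difference (illustrative only, not Dataflow's optimizer; the function names loosely echo the step names in the log):

```python
# Two stand-in "ParDo" functions, named after steps in the log above.
def strip_ids(element):      # e.g. Read input/StripIds
    return element["value"]

def time_monitor(value):     # e.g. Collect start time metrics
    return ("t", value)

elements = [{"value": 1}, {"value": 2}]

# Unfused: each stage materializes a full intermediate collection.
intermediate = [strip_ids(e) for e in elements]
unfused = [time_monitor(v) for v in intermediate]

# Fused: the consumer is applied immediately to each producer output,
# so no intermediate collection ever exists.
fused = [time_monitor(strip_ids(e)) for e in elements]

assert fused == unfused
print(fused)  # [('t', 1), ('t', 2)]
```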
Sep 25, 2021 12:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:20.078Z: Starting 5 workers in us-central1-a...

Sep 25, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:46:26.490Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 25, 2021 12:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:47:00.317Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 25, 2021 12:47:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:47:52.343Z: Workers have started successfully.
Sep 25, 2021 12:47:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T12:47:52.380Z: Workers have started successfully.
Sep 25, 2021 1:43:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-25T13:43:23.702Z: Staged package snakeyaml-1.27-fnzOZ0DtcFv9-qx7RCwTddKYbS8pNZNqW9QMFOGP1zY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/snakeyaml-1.27-fnzOZ0DtcFv9-qx7RCwTddKYbS8pNZNqW9QMFOGP1zY.jar' is inaccessible.
Sep 25, 2021 1:43:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-25T13:43:23.925Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 25, 2021 1:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-25T13:46:22.533Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 25, 2021 1:49:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-25T13:49:23.669Z: Staged package snakeyaml-1.27-fnzOZ0DtcFv9-qx7RCwTddKYbS8pNZNqW9QMFOGP1zY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/snakeyaml-1.27-fnzOZ0DtcFv9-qx7RCwTddKYbS8pNZNqW9QMFOGP1zY.jar' is inaccessible.
Sep 25, 2021 1:49:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-25T13:49:23.812Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 25, 2021 1:52:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-25T13:52:23.509Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 25, 2021 1:55:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-25T13:55:22.733Z: Staged package snakeyaml-1.27-fnzOZ0DtcFv9-qx7RCwTddKYbS8pNZNqW9QMFOGP1zY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/snakeyaml-1.27-fnzOZ0DtcFv9-qx7RCwTddKYbS8pNZNqW9QMFOGP1zY.jar' is inaccessible.
Sep 25, 2021 1:55:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-25T13:55:22.899Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 25, 2021 1:58:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-25T13:58:23.628Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 25, 2021 2:01:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-25T14:01:23.495Z: Staged package snakeyaml-1.27-fnzOZ0DtcFv9-qx7RCwTddKYbS8pNZNqW9QMFOGP1zY.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/snakeyaml-1.27-fnzOZ0DtcFv9-qx7RCwTddKYbS8pNZNqW9QMFOGP1zY.jar' is inaccessible.
Sep 25, 2021 2:01:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-25T14:01:23.651Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 25, 2021 2:04:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-25T14:04:23.675Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 25, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T16:00:24.378Z: Cancel request is committed for workflow job: 2021-09-25_05_46_08-17003214306424250645.
Sep 25, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T16:00:24.425Z: Cleaning up.
Sep 25, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T16:00:24.515Z: Stopping worker pool...
Sep 25, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T16:00:24.581Z: Stopping worker pool...
Sep 25, 2021 4:02:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T16:02:53.480Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 25, 2021 4:02:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-25T16:02:53.523Z: Worker pool stopped.
Sep 25, 2021 4:02:59 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-25_05_46_08-17003214306424250645 finished with status CANCELLED.
Load test results for test (ID): a64df3c6-23e3-4c70-bc22-3a5cfaa7c273 and timestamp: 2021-09-25T12:46:00.871000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11513.154
dataflow_v2_java11_total_bytes_count             3.40228439E10
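Dividing the two metrics above gives the run's sustained throughput — a back-of-the-envelope check, nothing more:

```python
# Values taken verbatim from the load test results above.
runtime_sec = 11513.154          # dataflow_v2_java11_runtime_sec
total_bytes = 3.40228439e10      # dataflow_v2_java11_total_bytes_count

bytes_per_sec = total_bytes / runtime_sec
print(f"{bytes_per_sec / 1e6:.2f} MB/s")  # roughly 2.96 MB/s sustained
```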
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210925124344
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ef445dbd7be2158e45a67cf1daf163a663fe113fa17e184c7d3cf03010be22fc
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210925124344]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ef445dbd7be2158e45a67cf1daf163a663fe113fa17e184c7d3cf03010be22fc]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210925124344] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ef445dbd7be2158e45a67cf1daf163a663fe113fa17e184c7d3cf03010be22fc])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ef445dbd7be2158e45a67cf1daf163a663fe113fa17e184c7d3cf03010be22fc
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ef445dbd7be2158e45a67cf1daf163a663fe113fa17e184c7d3cf03010be22fc
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ef445dbd7be2158e45a67cf1daf163a663fe113fa17e184c7d3cf03010be22fc].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 36s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/jzofc55bpp5dk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #99

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/99/display/redirect?page=changes>

Changes:

[noreply] Avoid setting empty builders in proto setters

[Brian Hulette] Add back 'more' break

[Robert Bradshaw] Fix some website logos.

[chuck.yang] Run BigQuery queries with batch priority

[chuck.yang] Allow changing query priority in ReadFromBigQuery

[chuck.yang] Update CHANGES

[kawaigin] Implicitly watch and track anonymous pipeline and PCollections

[chamikaramj] Reject requests when parameter names cannot be validated unless

[chamikaramj] Updates error message

[alexander.chermenin] [BEAM-10822] Fixed typo in BigqueryClient

[zyichi] [BEAM-12898] Minor fix to Kubernetes.groovy postBuildScript.

[noreply] [BEAM-12869] Bump tensorflow from 2.5.0 to 2.5.1 in

[noreply] [BEAM-12946] Fix SmallestPerKey, add unit testing (#15567)

[zyichi] [BEAM-12908] Sickbay pubsublite.ReadWriteIT

[noreply] Relocate Go SDK breaking change note out of template into 2.33.0.


------------------------------------------
[...truncated 48.54 KB...]
46ef6de51396: Preparing
0d891c597dbc: Preparing
b1f4c39338d2: Preparing
50dcb9edf7b8: Preparing
874a0a098966: Preparing
4ef5a9a214d8: Preparing
9e1a401c9ccd: Preparing
f6d860b1fb67: Preparing
181a53937f73: Preparing
894ae42d6f88: Preparing
3891808a925b: Preparing
d402f4f1b906: Preparing
00ef5416d927: Preparing
8555e663f65b: Preparing
d00da3cd7763: Preparing
4e61e63529c2: Preparing
799760671c38: Preparing
50dcb9edf7b8: Waiting
874a0a098966: Waiting
3891808a925b: Waiting
d402f4f1b906: Waiting
4ef5a9a214d8: Waiting
00ef5416d927: Waiting
9e1a401c9ccd: Waiting
894ae42d6f88: Waiting
8555e663f65b: Waiting
4e61e63529c2: Waiting
f6d860b1fb67: Waiting
799760671c38: Waiting
d00da3cd7763: Waiting
181a53937f73: Waiting
b1f4c39338d2: Pushed
d290b24e70a8: Pushed
46ef6de51396: Pushed
1cb465b10024: Pushed
50dcb9edf7b8: Pushed
0d891c597dbc: Pushed
4ef5a9a214d8: Pushed
181a53937f73: Pushed
9e1a401c9ccd: Pushed
3891808a925b: Layer already exists
874a0a098966: Pushed
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
d00da3cd7763: Layer already exists
4e61e63529c2: Layer already exists
799760671c38: Layer already exists
894ae42d6f88: Pushed
f6d860b1fb67: Pushed
20210924124334: digest: sha256:d32e0c80f8218aac64650bb53a88e7ec9ab5ff990a30aed32e4276b2fb6c7829 size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 24, 2021 12:46:10 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 24, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 24, 2021 12:46:11 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 24, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 24, 2021 12:46:14 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 24, 2021 12:46:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Sep 24, 2021 12:46:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 24, 2021 12:46:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash 9a74309829b03d9368f9bcd5e5fbd7c4d3815d4739eb9e6f8373f288760ac35a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-mnQwmCmwPZNo-bzV5fvXxNOBXUc5655vg3PyiHYKw1o.pb
Sep 24, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 24, 2021 12:46:16 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@aaa0f76]
Sep 24, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 24, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 24, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 24, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 24, 2021 12:46:16 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34b9eb03]
Sep 24, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 24, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 24, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 24, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 24, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 24, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 24, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 24, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 24, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 24, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 24, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 24, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 24, 2021 12:46:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-24_05_46_17-15053782412180795083?project=apache-beam-testing
Sep 24, 2021 12:46:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-24_05_46_17-15053782412180795083
Sep 24, 2021 12:46:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-24_05_46_17-15053782412180795083
Sep 24, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-24T12:46:23.048Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-jvhc. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 24, 2021 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:26.739Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:27.444Z: Expanding SplittableParDo operations into optimizable parts.
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:27.481Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:27.566Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:27.651Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:27.691Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:27.765Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:27.889Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:27.920Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:27.957Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:27.994Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.035Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.063Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.095Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.135Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.162Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.213Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.263Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.299Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.340Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.382Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.416Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.437Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.464Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.504Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.537Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.580Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.604Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.631Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:28.664Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Sep 24, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:29.069Z: Starting 5 workers in us-central1-a...
Sep 24, 2021 12:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:46:45.307Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 24, 2021 12:47:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:47:14.742Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 24, 2021 12:48:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:48:11.484Z: Workers have started successfully.
Sep 24, 2021 12:48:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T12:48:11.505Z: Workers have started successfully.
Sep 24, 2021 3:03:53 PM org.apache.beam.sdk.metrics.MetricsEnvironment getCurrentContainer
WARNING: Reporting metrics are not supported in the current execution environment.
Sep 24, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T16:00:25.712Z: Cancel request is committed for workflow job: 2021-09-24_05_46_17-15053782412180795083.
Sep 24, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T16:00:25.820Z: Cleaning up.
Sep 24, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T16:00:25.903Z: Stopping worker pool...
Sep 24, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T16:00:25.956Z: Stopping worker pool...
Sep 24, 2021 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T16:02:50.168Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 24, 2021 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-24T16:02:50.217Z: Worker pool stopped.
Sep 24, 2021 4:02:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-24_05_46_17-15053782412180795083 finished with status CANCELLED.
Load test results for test (ID): 9a89dc7e-2ad0-493e-8495-f10da3967164 and timestamp: 2021-09-24T12:46:11.349000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11499.376
dataflow_v2_java11_total_bytes_count             2.45097132E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210924124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d32e0c80f8218aac64650bb53a88e7ec9ab5ff990a30aed32e4276b2fb6c7829
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210924124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d32e0c80f8218aac64650bb53a88e7ec9ab5ff990a30aed32e4276b2fb6c7829]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210924124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d32e0c80f8218aac64650bb53a88e7ec9ab5ff990a30aed32e4276b2fb6c7829])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d32e0c80f8218aac64650bb53a88e7ec9ab5ff990a30aed32e4276b2fb6c7829
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d32e0c80f8218aac64650bb53a88e7ec9ab5ff990a30aed32e4276b2fb6c7829
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d32e0c80f8218aac64650bb53a88e7ec9ab5ff990a30aed32e4276b2fb6c7829].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 38s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/z4irkwzm6pp5a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #98

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/98/display/redirect?page=changes>

Changes:

[kileysok] Add MapState and SetState support

[Kyle Weaver] [BEAM-12898] Use new postbuildscript dsl in Flink load tests

[Kyle Weaver] run postbuild regardless of test result

[Kyle Weaver] spotless

[noreply] Merge pull request #15487 from [BEAM-12812] - Run Github Actions on GCP

[noreply] [BEAM-11097] Add SideInputCache to harness control type (#15530)

[kawaigin] Updated interactive integration test golden screenshots.

[noreply] Revert PR 15487 (BEAM-12812) (#15554)

[zyichi] [BEAM-12898] Trying out solution suggestion for JENKINS-66189 to solve

[noreply] [BEAM-12383] Adding Go SDK and Kafka IO to Gradle cross-language test


------------------------------------------
[...truncated 49.57 KB...]
77ce3c40d526: Preparing
747a4eef8d96: Preparing
05333715e415: Preparing
f3117e044d65: Preparing
199149875d8f: Preparing
f573d43a7df5: Preparing
6d9f8408d1d8: Preparing
3891808a925b: Preparing
d402f4f1b906: Preparing
00ef5416d927: Preparing
8555e663f65b: Preparing
d00da3cd7763: Preparing
4e61e63529c2: Preparing
799760671c38: Preparing
f573d43a7df5: Waiting
05333715e415: Waiting
d402f4f1b906: Waiting
f3117e044d65: Waiting
6d9f8408d1d8: Waiting
199149875d8f: Waiting
00ef5416d927: Waiting
3891808a925b: Waiting
8555e663f65b: Waiting
77ce3c40d526: Waiting
799760671c38: Waiting
66d0e622a342: Pushed
40c1063b98f7: Pushed
8b4798633287: Pushed
563013fdf011: Pushed
77ce3c40d526: Pushed
542d82ea7219: Pushed
05333715e415: Pushed
f3117e044d65: Pushed
f573d43a7df5: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
d00da3cd7763: Layer already exists
799760671c38: Layer already exists
4e61e63529c2: Layer already exists
747a4eef8d96: Pushed
6d9f8408d1d8: Pushed
199149875d8f: Pushed
20210923124330: digest: sha256:b825d3afd98b306411674734fe358a4dcb9108480a7dd506e1b2cf5cc1204c9b size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 23, 2021 12:45:29 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 23, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 23, 2021 12:45:30 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 23, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 23, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 23, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Sep 23, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 23, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash 98cb4a294d1cd10162b45f0d703e1e60c1959aaaa08d532c62d5a2136630e364> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-mMtKKU0c0QFitF8NcD4eYMGVmqqgjVMsYtWiE2Yw42Q.pb
Sep 23, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 23, 2021 12:45:35 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@298f0a0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b960a7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31dfc6f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37b52340, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@663bb8ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4e40d7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60e9c3a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e5843db, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459f703f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@188ac8a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3650d4fc]
Sep 23, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 23, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 23, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 23, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 23, 2021 12:45:35 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24841372, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77114efe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79a7bfbc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77f68df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e4e4c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e7f3cfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ae126d1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a488c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6242ae3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65ddee5a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56399b9e]
Sep 23, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 23, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 23, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 23, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 23, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 23, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 23, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 23, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 23, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 23, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 23, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 23, 2021 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 23, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-23_05_45_35-8041644847459113944?project=apache-beam-testing
Sep 23, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-23_05_45_35-8041644847459113944
Sep 23, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-23_05_45_35-8041644847459113944
Sep 23, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-23T12:45:42.329Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-9zc7. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 23, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:47.196Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 23, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:47.872Z: Expanding SplittableParDo operations into optimizable parts.
Sep 23, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:47.908Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 23, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:47.969Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 23, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.044Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Sep 23, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.078Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 23, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.145Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 23, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.262Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 23, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.286Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 23, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.317Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 23, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.343Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 23, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.425Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.458Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.496Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.527Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.551Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.584Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.638Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Sep 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.680Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Sep 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.708Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.744Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.770Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.800Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.821Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.853Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Sep 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.877Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Sep 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.899Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Sep 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.921Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 23, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.954Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Sep 23, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:48.978Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Sep 23, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:45:49.352Z: Starting 5 workers in us-central1-a...
Sep 23, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:46:17.031Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 23, 2021 12:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:46:19.446Z: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 23, 2021 12:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:46:19.478Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
Sep 23, 2021 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:46:29.806Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 23, 2021 12:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:47:24.223Z: Workers have started successfully.
Sep 23, 2021 12:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T12:47:24.264Z: Workers have started successfully.
Sep 23, 2021 2:03:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-23T14:03:52.651Z: Staged package jackson-dataformat-cbor-2.12.4-xpFy2QEl4dUh55AUzgOueyVcIoNGuAdPK6G3l4P-yCc.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-dataformat-cbor-2.12.4-xpFy2QEl4dUh55AUzgOueyVcIoNGuAdPK6G3l4P-yCc.jar' is inaccessible.
Sep 23, 2021 2:03:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-23T14:03:54.143Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 23, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T16:00:33.259Z: Cancel request is committed for workflow job: 2021-09-23_05_45_35-8041644847459113944.
Sep 23, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T16:00:33.315Z: Cleaning up.
Sep 23, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T16:00:33.399Z: Stopping worker pool...
Sep 23, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T16:00:33.462Z: Stopping worker pool...
Sep 23, 2021 4:02:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T16:02:53.617Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 23, 2021 4:02:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-23T16:02:53.654Z: Worker pool stopped.
Sep 23, 2021 4:03:00 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-23_05_45_35-8041644847459113944 finished with status CANCELLED.
Load test results for test (ID): 0f75d1c2-d44a-4145-87a0-b2cea919cdc1 and timestamp: 2021-09-23T12:45:30.189000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11524.165
dataflow_v2_java11_total_bytes_count             2.93525952E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210923124330
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b825d3afd98b306411674734fe358a4dcb9108480a7dd506e1b2cf5cc1204c9b
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210923124330]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b825d3afd98b306411674734fe358a4dcb9108480a7dd506e1b2cf5cc1204c9b]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210923124330] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b825d3afd98b306411674734fe358a4dcb9108480a7dd506e1b2cf5cc1204c9b])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b825d3afd98b306411674734fe358a4dcb9108480a7dd506e1b2cf5cc1204c9b
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b825d3afd98b306411674734fe358a4dcb9108480a7dd506e1b2cf5cc1204c9b
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b825d3afd98b306411674734fe358a4dcb9108480a7dd506e1b2cf5cc1204c9b].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 52s
101 actionable tasks: 71 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/kolxg2tci2sn4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #97

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/97/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12919] Removed IBM Streams from runner matrix (#15542)

[danthev] Fix 2.32.0 release notes.

[noreply] [BEAM-12258] Re-throw exception from forked thread in

[kawaigin] [BEAM-10708] Added an example notebook for beam_sql magic

[noreply] Add a timeout for BQ streaming_insert RPCS (#15541)

[noreply] Merge pull request #15537 from [BEAM-12908] Add a sleep to the IT after


------------------------------------------
[...truncated 49.31 KB...]
b0c95b2954a3: Pushed
75862a2f912c: Pushed
3c77729b931b: Pushed
e0858b5565a7: Pushed
c23774ca1987: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
d00da3cd7763: Layer already exists
4e61e63529c2: Layer already exists
b63bcb3be20b: Pushed
799760671c38: Layer already exists
5d294ac669be: Pushed
c036ede0a712: Pushed
24d403efe7b4: Pushed
20210922124335: digest: sha256:eff74314ab303743ab8314e947d1999c5f0b633378bc0789128ffce6bf27b20a size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 22, 2021 12:45:33 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 22, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 22, 2021 12:45:34 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 22, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 22, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Sep 22, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 22, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash fbdd9ba961cf8321bc47225252568573fc4a4cb0e7fb71b2ccd623e964808a11> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline--92bqWHPgyG8RyJSUlaFc_xKTLDn-3GyzNYj6WSAihE.pb
Sep 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 22, 2021 12:45:38 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2eeb0f9b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b1c538d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1645f294, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6325f352, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15c4af7a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6cbd0674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55d58825, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@19a64eae, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29a98d9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da3b078, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@544e8149, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef]
Sep 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 22, 2021 12:45:38 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e5b7fba, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27755487, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f0cab0a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe7b6b0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab4ae59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77681ce4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d96bdf8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f76c2cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@306f6f1d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d7cac8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fc6deb7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713]
Sep 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 22, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-22_05_45_38-8630686692093343177?project=apache-beam-testing
Sep 22, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-22_05_45_38-8630686692093343177
Sep 22, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-22_05_45_38-8630686692093343177
Sep 22, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-22T12:45:46.517Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-i48w. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 22, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:50.707Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:51.572Z: Expanding SplittableParDo operations into optimizable parts.
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:51.608Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:51.673Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:51.753Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:51.793Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:51.861Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:51.956Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:51.989Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.026Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.061Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.091Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.124Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.156Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.189Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.226Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.262Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.295Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.326Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.354Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.392Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.429Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.464Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.498Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.535Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.557Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.592Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.623Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.646Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.677Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Sep 22, 2021 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:53.039Z: Starting 5 workers in us-central1-a...
Sep 22, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:46:03.523Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 22, 2021 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:46:22.700Z: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 22, 2021 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:46:22.729Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
Sep 22, 2021 12:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:46:32.989Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 22, 2021 12:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:47:29.757Z: Workers have started successfully.
Sep 22, 2021 12:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:47:29.788Z: Workers have started successfully.
Sep 22, 2021 2:09:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-22T14:09:53.191Z: Staged package amazon-kinesis-producer-0.14.1-5YYT_oyDFV17VWs8Of43O7N08sRfHu0SuJeqVSFCuLg.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/amazon-kinesis-producer-0.14.1-5YYT_oyDFV17VWs8Of43O7N08sRfHu0SuJeqVSFCuLg.jar' is inaccessible.
Sep 22, 2021 2:09:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-22T14:09:57.035Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 22, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T16:00:27.380Z: Cancel request is committed for workflow job: 2021-09-22_05_45_38-8630686692093343177.
Sep 22, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T16:00:27.416Z: Cleaning up.
Sep 22, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T16:00:27.491Z: Stopping worker pool...
Sep 22, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T16:00:27.565Z: Stopping worker pool...
Sep 22, 2021 4:02:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T16:02:44.163Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 22, 2021 4:02:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T16:02:44.201Z: Worker pool stopped.
Sep 22, 2021 4:02:49 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-22_05_45_38-8630686692093343177 finished with status CANCELLED.
Load test results for test (ID): 7e9be1e1-4f39-4908-9b1d-ea8004aa235b and timestamp: 2021-09-22T12:45:33.904000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11535.738
dataflow_v2_java11_total_bytes_count             2.92455941E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210922124335
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:eff74314ab303743ab8314e947d1999c5f0b633378bc0789128ffce6bf27b20a
Deleted: sha256:13456361d1b2fbc317b58dedd4c54c27ac5f742edfd3eaccc5528e9ac6160e94
Deleted: sha256:b3bf7ec2f7552e27427a868c7b901bed80a7bb312c9a68532f488a9da140f4b9
Deleted: sha256:76d19613a00da22afdd6e5ad3516ce02e0d388dc69fd7795c0a5cdc63bc52f75
Deleted: sha256:70181a744720f29fd5615ec56fce13bcf27a967405a0483ad3df1afc59df016f
Deleted: sha256:cf1acda140f19415c74687bb2abba4eea28fbb9a8c91b1c5854c6bb8f13e8850
Deleted: sha256:687d5a5d634adc80adb4a9701440a2954bf8dd4f4c691e17030fe0119ea79ff6
Deleted: sha256:ee219303f9b8e622d62b7a53aa16cf9bc8b25e2bdbe5ee38d272b32b488de3af
Deleted: sha256:f7787378d4113a51c42376eb1a1b368d2400dbac8537c078e881c520bbb0df79
Deleted: sha256:c5263b87bad64aa125a9ca3cc37fed078f7fa58066dc96056bc87cf5a449739a
Deleted: sha256:7dff09d37c5974b025478c8995f588bb466b8242785edbcd5138236ff37e9b9e
Deleted: sha256:a55477630e507e10750021b0e73ed7ffa3c7fa74cef86c423723c4eb74854aba
Deleted: sha256:1fecfd038bcaf4276e8ede41605ed687b653e06820a6883d224d5f2d12c43fcd
Deleted: sha256:7669ab5ce7d9ace61d245839652dab2eedc4d767598ee2d1e95a8dd76c24f701
Deleted: sha256:5c48581ce374529e720e84ffa0c20919f66f571df54cde3277ce2a377e37c03f
Deleted: sha256:c09a7fa64d597b371f04e4365e8dfe7ebd8d2f17bbf699d9f0ba3eea6c1db9ed
Deleted: sha256:d3c0aa9f2368fbafb5b1110731164576329b49c98fa93466b68b0f8663e3cb92
Deleted: sha256:784cc9603582840701620872deac520f54d9ac58dc863137bbdaf8211c45e988
Deleted: sha256:8d85aa290abd9dd675feec91a8e7b836080b4fd3be512d212156b5616c8b083d
Deleted: sha256:7cb2d62743b08bc8c22a0a5998280efab6cb0e9b4e2b1a9a8940c7d50367e898
Deleted: sha256:d3fcd50dfe22262d70cd51149d5cb9ef7f7dcf5ffdf100d855ba5e4a15415401
Deleted: sha256:6bfd67221536ed73ac3b1948d682ab8287a85662e4b2431b22a6906d4f149409
Deleted: sha256:810a2f68192eee2fbdd5a32f6cb13b6e47dbffaf3a80913bcd23c6ffcf15d620
Deleted: sha256:02e087a3cd8fff999a51746db2e1fd1a4ba9bac8b53e813a9b97b25c598f89a1
Deleted: sha256:2b18d073580d2e89148ce234a7b49f779c134587f8dfe49b4c4ad0a499113fd2
Deleted: sha256:d2f785efc30adaf5d0ebf3cd41162fd88c1b78f5f6884d11e6741536061c6e6b
Deleted: sha256:7df056af9fdca24107e272aec2cbbc8d5b930657364c10b8460dee67c37fb09a
Deleted: sha256:b6c63cd83a11dd4c8aa3449be982e3f3acfe3ad3acfb16a564100063e5ef967c
Deleted: sha256:76a645bf1651ec0c5e745edc178a4fa533b2fdfe4ff2e109c4d37336b898d60d
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210922124335]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:eff74314ab303743ab8314e947d1999c5f0b633378bc0789128ffce6bf27b20a]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210922124335] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:eff74314ab303743ab8314e947d1999c5f0b633378bc0789128ffce6bf27b20a])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:eff74314ab303743ab8314e947d1999c5f0b633378bc0789128ffce6bf27b20a
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:eff74314ab303743ab8314e947d1999c5f0b633378bc0789128ffce6bf27b20a
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:eff74314ab303743ab8314e947d1999c5f0b633378bc0789128ffce6bf27b20a].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 33s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/r4w6jff3bwbha

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #96

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/96/display/redirect?page=changes>

Changes:

[chamikaramj] Python support for directly using Java transforms using constructor and

[chamikaramj] Fixes yapf

[chamikaramj] Fixes lint

[chamikaramj] Addressing reviewer comments

[chamikaramj] Adds support for a field name format that will be ignored at expansion

[danthev] Fix flaky test.

[danthev] Fix lint errors.

[Robert Bradshaw] Additional CoGBK tests.

[chamikaramj] Addresses reviewer comments

[chamikaramj] Use correct ignore field prefix in Python side

[noreply] [BEAM-12803] Update deprecated use of _field_types (#15539)

[Robert Bradshaw] Move CoGBK tests into appropriate module.


------------------------------------------
[...truncated 49.75 KB...]
d00da3cd7763: Waiting
799760671c38: Waiting
d402f4f1b906: Waiting
8555e663f65b: Waiting
00ef5416d927: Waiting
28e102d0ceaa: Pushed
3e9dfe7530d6: Pushed
07e78b1349a3: Pushed
3de5088183ac: Pushed
6ad30953503c: Pushed
687490d53da3: Pushed
672268a545f2: Pushed
2e01ff9ddad4: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
67cb3d00a566: Pushed
d00da3cd7763: Layer already exists
7e9d4d56b4da: Pushed
799760671c38: Layer already exists
4e61e63529c2: Layer already exists
0d184f2ec06c: Pushed
942298825dbf: Pushed
20210921124331: digest: sha256:ffc8a8a8aefa672ed766bb5c4b710d35a7fe8967874ee70d2fc271cc040cd37b size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 21, 2021 12:45:25 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 21, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 21, 2021 12:45:26 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 21, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 21, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 21, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Sep 21, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 21, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash c60db03395aede5fe28e6856a620d116dd20385ccaa0bb8a3b0165dd4eeaa0dd> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-xg2wM5Wu3l_ijmhWpiDRFt0gOFzKoLuKOwFl3U7qoN0.pb
Sep 21, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 21, 2021 12:45:31 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b1c538d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1645f294, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6325f352, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15c4af7a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6cbd0674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55d58825, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@19a64eae, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29a98d9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da3b078, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@544e8149, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376]
Sep 21, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 21, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 21, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 21, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 21, 2021 12:45:31 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27755487, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f0cab0a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe7b6b0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab4ae59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77681ce4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d96bdf8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f76c2cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@306f6f1d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d7cac8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fc6deb7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67]
Sep 21, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 21, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 21, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 21, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 21, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 21, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 21, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 21, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 21, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 21, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 21, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 21, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 21, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-21_05_45_31-8373750593306316118?project=apache-beam-testing
Sep 21, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-21_05_45_31-8373750593306316118
Sep 21, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-21_05_45_31-8373750593306316118
Sep 21, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-21T12:45:39.445Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-b8wl. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 21, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:42.785Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 21, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:43.591Z: Expanding SplittableParDo operations into optimizable parts.
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:43.630Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:43.698Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:43.770Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:43.790Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:43.855Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:43.950Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:43.985Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.022Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.046Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.079Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.104Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.130Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.162Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.192Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.222Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.247Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.277Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.304Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.339Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.366Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.395Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.427Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.463Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.509Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.544Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.571Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.607Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Sep 21, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.639Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Sep 21, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:45:44.961Z: Starting 5 workers in us-central1-a...
Sep 21, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:46:12.810Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 21, 2021 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:46:28.732Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 21, 2021 12:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:47:21.847Z: Workers have started successfully.
Sep 21, 2021 12:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T12:47:21.867Z: Workers have started successfully.
Sep 21, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T16:00:25.219Z: Cancel request is committed for workflow job: 2021-09-21_05_45_31-8373750593306316118.
Sep 21, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T16:00:25.285Z: Cleaning up.
Sep 21, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T16:00:25.353Z: Stopping worker pool...
Sep 21, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T16:00:25.408Z: Stopping worker pool...
Sep 21, 2021 4:02:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T16:02:36.748Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 21, 2021 4:02:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-21T16:02:36.791Z: Worker pool stopped.
Sep 21, 2021 4:02:42 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-21_05_45_31-8373750593306316118 finished with status CANCELLED.
Load test results for test (ID): 32adc8e4-ca77-4235-b316-49aaa60d6e99 and timestamp: 2021-09-21T12:45:26.590000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11510.835
dataflow_v2_java11_total_bytes_count             3.09313067E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210921124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ffc8a8a8aefa672ed766bb5c4b710d35a7fe8967874ee70d2fc271cc040cd37b
Deleted: sha256:270f1b7a27d8ac27fdda11fbbc47fd07d780aebd36f2c0c58f2eaaeb9dcdaf21
Deleted: sha256:534d819b73beae359564314ff030a03872f7814adcfa1b666bbd5c464d36ce63
Deleted: sha256:43e1ce4c2c4f9008eed01c1e3d8faf1fbe0c1cd42707bd3dadf12d52223cec69
Deleted: sha256:84e9e5fdf1043c63df15e16972ece1b88373e1900e8870d0e793162cacc2c20e
Deleted: sha256:1320a62222ca1c52d2fe006875c5c959916c0d14e2d3711ce24377d4f594d7d9
Deleted: sha256:bb4ff5aafab5d16c3d35d2dec6448fc70fc57a90414e18eb20e2283f63a14f9f
Deleted: sha256:dcaaecd16612d6ae3513275f1fe66c3937d882a90eb2861e764579795fb5f088
Deleted: sha256:54db48fd4f9e018fd2452f91db22b0dc8b11dca49d1f033dba154e4540c4a588
Deleted: sha256:efc2a1c31c6dbf22bd044972882e7372591e448c83b5007b22860ee989a0f254
Deleted: sha256:fa04eef5b0fecb0e5e8352a89816bf8a7c9fffe94b093c5067d6ef8a7df82a2a
Deleted: sha256:c6e08c09dd20b92186d8adf36b1d8e89ca32d43af792ba84eb99e14ff691688d
Deleted: sha256:4e800fd313af7de18e10d15b5ac9736998b3e362390a29b20e5938f02b7a2de2
Deleted: sha256:5212a5655d31994c625c107e4ea797d9842cd1f7832bcd8935810a26fd5bfff5
Deleted: sha256:2ceac9e2666c29519511fc74012982552cf2bc68c2b8d576dfa9a3a3f322a1b5
Deleted: sha256:7aa5c32295ff420e2e09c04fa36cb5559d00c0a30623d6a7cc76ff3f71cfb844
Deleted: sha256:693ada4ee2d2fe82b3a254b4cce208fba427fa42352073f45e03d634b65cb31e
Deleted: sha256:386abc8e4b61e5359e1ac2b87e9e9b6ab6140b8ff07a321105c38c756fd89ab5
Deleted: sha256:dfeadcbdab5b0db39aaf5dac91372694695d72049a59dfb35b7a7075ed97fcb6
Deleted: sha256:204d18954b11ae2fa00ed668bf1fa93ce25d796212d643fc8b2cdf25a32da4b1
Deleted: sha256:9d3f3e63d204e036f091a9b9d1926686dc851d1cb7d9d49212b5d07625a5fb01
Deleted: sha256:5298e21d346968f4ff6c5b3bcc86a8ac6b5de28523ebbc4202f416831422f9ef
Deleted: sha256:986539b8c15b327088dfd1bac55510902b263aab506775fea2f967e6b5a49b5d
Deleted: sha256:0b14c60920782f91fdf6c736f1cf9742fd420229828f37069ed556feed123da9
Deleted: sha256:d40f7de54e87e43af8a054445849f8db6f918139dc142d217444366bf95240c5
Deleted: sha256:fb925b67157f4c8d8eb731a067e640352e219e47677ea1d94b79005e8ed4a1fb
Deleted: sha256:06cccf38bc7b4181acf5bf1c5d82b53b1ad0b0e9c526a82f9ae885a13c61f7bc
Deleted: sha256:1147b300d1a0b592dfc6a9ee575ab372062cb6d23d15ebd33991ebca443a9527
Deleted: sha256:9e86ced2a9955da60a7b13f12f09c46cbf5163c847d78e56812723992cb4eace
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210921124331]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ffc8a8a8aefa672ed766bb5c4b710d35a7fe8967874ee70d2fc271cc040cd37b]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210921124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ffc8a8a8aefa672ed766bb5c4b710d35a7fe8967874ee70d2fc271cc040cd37b])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ffc8a8a8aefa672ed766bb5c4b710d35a7fe8967874ee70d2fc271cc040cd37b
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ffc8a8a8aefa672ed766bb5c4b710d35a7fe8967874ee70d2fc271cc040cd37b
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ffc8a8a8aefa672ed766bb5c4b710d35a7fe8967874ee70d2fc271cc040cd37b].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 28s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/6bjhdfldeej3k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #95

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/95/display/redirect>

Changes:


------------------------------------------
[...truncated 49.22 KB...]
94ab71098673: Preparing
953732d98cbd: Preparing
b1c428ea8ea4: Preparing
4708d6cf1204: Preparing
554f5e548896: Preparing
d0f03a3418d6: Preparing
d80ccc5dfad1: Preparing
5c807eafef2f: Preparing
08c7dbaefe9c: Preparing
4708d6cf1204: Waiting
9009985da905: Preparing
3891808a925b: Preparing
554f5e548896: Waiting
d402f4f1b906: Preparing
00ef5416d927: Preparing
d0f03a3418d6: Waiting
8555e663f65b: Preparing
d00da3cd7763: Preparing
4e61e63529c2: Preparing
799760671c38: Preparing
d80ccc5dfad1: Waiting
5c807eafef2f: Waiting
8555e663f65b: Waiting
08c7dbaefe9c: Waiting
d00da3cd7763: Waiting
4e61e63529c2: Waiting
9009985da905: Waiting
799760671c38: Waiting
3891808a925b: Waiting
00ef5416d927: Waiting
d402f4f1b906: Waiting
94ab71098673: Pushed
b1c428ea8ea4: Pushed
35a5fcdac39c: Pushed
3ec30b57f4d5: Pushed
953732d98cbd: Pushed
4708d6cf1204: Pushed
d0f03a3418d6: Pushed
d80ccc5dfad1: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
08c7dbaefe9c: Pushed
8555e663f65b: Layer already exists
9009985da905: Pushed
d00da3cd7763: Layer already exists
554f5e548896: Pushed
4e61e63529c2: Layer already exists
799760671c38: Layer already exists
5c807eafef2f: Pushed
20210920124338: digest: sha256:68163f05f1c3e4be7d2e732835240ea00b3091c5cd7faabd8f2ccf367364f635 size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 20, 2021 12:47:41 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 20, 2021 12:47:41 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 20, 2021 12:47:42 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 20, 2021 12:47:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 20, 2021 12:47:45 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 20, 2021 12:47:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /home/jenkins/.m2/repository/org/apache/beam/beam-vendor-grpc-1_36_0/0.2/beam-vendor-grpc-1_36_0-0.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-nVxp3_1Xbd1ktCTMOtQxL1xQUUS_NwN-RSOhZ5xBwdg.jar
Sep 20, 2021 12:47:46 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 194 files cached, 1 files newly uploaded in 1 seconds
Sep 20, 2021 12:47:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 20, 2021 12:47:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash 210c80cec4e0c92a301a821c838a61e79b3ddff3b72bf14212648f1f95df4ed7> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-IQyAzsTgySowGoIcg4ph55s93_O3K_FCEmSPH5XfTtc.pb
Sep 20, 2021 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 20, 2021 12:47:48 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2eeb0f9b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b1c538d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1645f294, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6325f352, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15c4af7a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6cbd0674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55d58825, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@19a64eae, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29a98d9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da3b078, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@544e8149, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef]
Sep 20, 2021 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 20, 2021 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 20, 2021 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 20, 2021 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 20, 2021 12:47:48 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e5b7fba, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27755487, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f0cab0a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe7b6b0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab4ae59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77681ce4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d96bdf8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f76c2cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@306f6f1d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d7cac8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fc6deb7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713]
Sep 20, 2021 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 20, 2021 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 20, 2021 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 20, 2021 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 20, 2021 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 20, 2021 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 20, 2021 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 20, 2021 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 20, 2021 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 20, 2021 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 20, 2021 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 20, 2021 12:47:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 20, 2021 12:47:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-20_05_47_48-1978144490141291391?project=apache-beam-testing
Sep 20, 2021 12:47:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-20_05_47_48-1978144490141291391
Sep 20, 2021 12:47:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-20_05_47_48-1978144490141291391
Sep 20, 2021 12:47:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-20T12:47:56.403Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-8c8o. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 20, 2021 12:48:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:00.671Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.229Z: Expanding SplittableParDo operations into optimizable parts.
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.257Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.313Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.384Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.406Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.500Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.612Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.649Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.680Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.715Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.736Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.763Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.786Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.806Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.839Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.875Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.908Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.941Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.964Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:01.996Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:02.029Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:02.054Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:02.089Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:02.122Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:02.145Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:02.171Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:02.196Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:02.228Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:02.266Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Sep 20, 2021 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:02.596Z: Starting 5 workers in us-central1-a...
Sep 20, 2021 12:48:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:30.435Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 20, 2021 12:48:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:48:51.527Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 20, 2021 12:49:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:49:47.212Z: Workers have started successfully.
Sep 20, 2021 12:49:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T12:49:47.238Z: Workers have started successfully.
Sep 20, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T16:00:24.695Z: Cancel request is committed for workflow job: 2021-09-20_05_47_48-1978144490141291391.
Sep 20, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T16:00:24.750Z: Cleaning up.
Sep 20, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T16:00:24.812Z: Stopping worker pool...
Sep 20, 2021 4:00:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T16:00:24.855Z: Stopping worker pool...
Sep 20, 2021 4:02:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T16:02:49.553Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 20, 2021 4:02:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-20T16:02:49.585Z: Worker pool stopped.
Sep 20, 2021 4:02:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-20_05_47_48-1978144490141291391 finished with status CANCELLED.
Load test results for test (ID): ec7d27d1-5b7e-4517-a466-729f58a15f14 and timestamp: 2021-09-20T12:47:42.248000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11363.641
dataflow_v2_java11_total_bytes_count             2.05473991E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210920124338
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:68163f05f1c3e4be7d2e732835240ea00b3091c5cd7faabd8f2ccf367364f635
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210920124338]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:68163f05f1c3e4be7d2e732835240ea00b3091c5cd7faabd8f2ccf367364f635]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210920124338] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:68163f05f1c3e4be7d2e732835240ea00b3091c5cd7faabd8f2ccf367364f635])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:68163f05f1c3e4be7d2e732835240ea00b3091c5cd7faabd8f2ccf367364f635
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:68163f05f1c3e4be7d2e732835240ea00b3091c5cd7faabd8f2ccf367364f635
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:68163f05f1c3e4be7d2e732835240ea00b3091c5cd7faabd8f2ccf367364f635].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 37s
101 actionable tasks: 71 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/sg7hblbv7vz22

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #94

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/94/display/redirect>

Changes:


------------------------------------------
[...truncated 49.02 KB...]
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
b69cf6716aaf: Preparing
cff32e42fb6f: Preparing
7f9400f001db: Preparing
80da27462470: Preparing
e8b33b763bd6: Preparing
9c0973a5e643: Preparing
5f7e05b91381: Preparing
a0fbc6d46abd: Preparing
eddc235e5e23: Preparing
646a73e325fe: Preparing
51ed3827f313: Preparing
dfd16bb7186a: Preparing
3891808a925b: Preparing
d402f4f1b906: Preparing
00ef5416d927: Preparing
8555e663f65b: Preparing
d00da3cd7763: Preparing
4e61e63529c2: Preparing
799760671c38: Preparing
eddc235e5e23: Waiting
9c0973a5e643: Waiting
646a73e325fe: Waiting
5f7e05b91381: Waiting
51ed3827f313: Waiting
dfd16bb7186a: Waiting
a0fbc6d46abd: Waiting
00ef5416d927: Waiting
d00da3cd7763: Waiting
4e61e63529c2: Waiting
3891808a925b: Waiting
d402f4f1b906: Waiting
8555e663f65b: Waiting
e8b33b763bd6: Pushed
cff32e42fb6f: Pushed
7f9400f001db: Pushed
b69cf6716aaf: Pushed
80da27462470: Pushed
9c0973a5e643: Pushed
a0fbc6d46abd: Pushed
eddc235e5e23: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
5f7e05b91381: Pushed
d00da3cd7763: Layer already exists
4e61e63529c2: Layer already exists
799760671c38: Layer already exists
51ed3827f313: Pushed
dfd16bb7186a: Pushed
646a73e325fe: Pushed
20210919124332: digest: sha256:ed3fc659aa9c33a09f43434d8433a1041be188d58ffcbd0494f2307d5939469b size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 19, 2021 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 19, 2021 12:45:44 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 19, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 19, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 19, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Sep 19, 2021 12:45:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 19, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash b91d021c56b3a79b73f5e9124f299638b187d279b5ec001ed924414b96b00b41> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-uR0CHFazp5tz9ekSTymWOLGH0nm17AAe2SRBS5awC0E.pb
Sep 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 19, 2021 12:45:48 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2eeb0f9b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b1c538d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1645f294, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6325f352, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15c4af7a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6cbd0674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55d58825, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@19a64eae, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29a98d9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da3b078, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@544e8149, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef]
Sep 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 19, 2021 12:45:48 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e5b7fba, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27755487, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f0cab0a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe7b6b0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab4ae59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77681ce4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d96bdf8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f76c2cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@306f6f1d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d7cac8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fc6deb7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713]
Sep 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 19, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-19_05_45_48-12301866298194863507?project=apache-beam-testing
Sep 19, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-19_05_45_48-12301866298194863507
Sep 19, 2021 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-19_05_45_48-12301866298194863507
Sep 19, 2021 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-19T12:45:58.478Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-j2j. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:02.396Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.259Z: Expanding SplittableParDo operations into optimizable parts.
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.296Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.359Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.417Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.447Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.491Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.585Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.621Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.649Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.677Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.709Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.744Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.769Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.799Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.836Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.867Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.893Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.941Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:03.997Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:04.037Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:04.077Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:04.102Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:04.129Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 19, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:04.163Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Sep 19, 2021 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:04.190Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Sep 19, 2021 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:04.219Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Sep 19, 2021 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:04.253Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 19, 2021 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:04.290Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Sep 19, 2021 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:04.317Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Sep 19, 2021 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:04.666Z: Starting 5 workers in us-central1-a...
Sep 19, 2021 12:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:22.205Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 19, 2021 12:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:46:45.023Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 19, 2021 12:47:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:47:40.608Z: Workers have started successfully.
Sep 19, 2021 12:47:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T12:47:40.633Z: Workers have started successfully.
Sep 19, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T16:00:28.211Z: Cancel request is committed for workflow job: 2021-09-19_05_45_48-12301866298194863507.
Sep 19, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T16:00:28.278Z: Cleaning up.
Sep 19, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T16:00:28.338Z: Stopping worker pool...
Sep 19, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T16:00:28.381Z: Stopping worker pool...
Sep 19, 2021 4:02:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T16:02:45.353Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 19, 2021 4:02:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-19T16:02:45.381Z: Worker pool stopped.
Sep 19, 2021 4:02:51 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-19_05_45_48-12301866298194863507 finished with status CANCELLED.
Load test results for test (ID): 9f9ab5d7-67bc-4a9f-9a89-69febc7742f1 and timestamp: 2021-09-19T12:45:44.285000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11502.892
dataflow_v2_java11_total_bytes_count              2.4309485E10
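From the two metrics reported above one can derive an approximate sustained throughput (total bytes processed divided by runtime). This is plain arithmetic added only to make the reported numbers concrete; it is not part of the load-test output:

```java
// Derives bytes/second from the load test's reported metrics.
// Inputs correspond to dataflow_v2_java11_total_bytes_count and
// dataflow_v2_java11_runtime_sec above; roughly 2.1 MB/s here.
final class ThroughputSketch {
    static double bytesPerSecond(double totalBytes, double runtimeSec) {
        return totalBytes / runtimeSec;
    }
}
```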
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
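The stack trace above comes from the load-test harness treating any terminal job state other than DONE as a failure: the streaming job was cancelled on a timer (see the "Cancel request is committed" line), so the harness threw. A minimal sketch of that check, with illustrative names rather than the actual org.apache.beam.sdk.loadtests.JobFailure source:

```java
// Hypothetical sketch of the terminal-state check behind
// "Invalid job state: CANCELLED." above; names are illustrative.
enum JobState { DONE, CANCELLED, FAILED, UPDATED }

final class JobFailureSketch {
    // Throws unless the pipeline reached DONE, mirroring the stack trace.
    static void handleFailure(JobState terminalState) {
        if (terminalState != JobState.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }
}
```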

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210919124332
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ed3fc659aa9c33a09f43434d8433a1041be188d58ffcbd0494f2307d5939469b
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210919124332]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ed3fc659aa9c33a09f43434d8433a1041be188d58ffcbd0494f2307d5939469b]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210919124332] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ed3fc659aa9c33a09f43434d8433a1041be188d58ffcbd0494f2307d5939469b])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ed3fc659aa9c33a09f43434d8433a1041be188d58ffcbd0494f2307d5939469b
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ed3fc659aa9c33a09f43434d8433a1041be188d58ffcbd0494f2307d5939469b
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ed3fc659aa9c33a09f43434d8433a1041be188d58ffcbd0494f2307d5939469b].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 37s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/nuoby3h7nt7lo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #93

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/93/display/redirect?page=changes>

Changes:

[samuelw] [BEAM-12740] Add option to CreateOptions to avoid GetObjectMetadata for

[noreply] Minor: Prune docker volumes in Inventory job(#15532)


------------------------------------------
[...truncated 49.52 KB...]
4e61e63529c2: Layer already exists
799760671c38: Layer already exists
4d9c3cdf1c5b: Pushed
0be840f624d2: Pushed
c4b09fbab6de: Pushed
f1158c71e8f6: Pushed
20210918124341: digest: sha256:2d774c4ea296c365b422dde2269b3f6777600ca504a4a452b8f59df29e3c8d32 size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 18, 2021 12:45:57 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 18, 2021 12:45:58 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 18, 2021 12:45:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 18, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 18, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 1 seconds
Sep 18, 2021 12:46:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 18, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash 979468afaa23619dcd48e71c5232d83c781bf76100eb45c5da4c75d3f2fc2683> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-l5Ror6ojYZ3NSOccUjLYPHgb92EA60XF2kx10_L8JoM.pb
Sep 18, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 18, 2021 12:46:04 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b1c538d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1645f294, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6325f352, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15c4af7a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6cbd0674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55d58825, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@19a64eae, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29a98d9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da3b078, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@544e8149, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376]
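The "Split into 20 bundles" line above reflects the source honoring the runner's desired split count. One plausible way such a split can be computed is to divide the configured record count evenly and spread any remainder over the leading sub-sources; this is an assumption for illustration, not the actual SyntheticUnboundedSource algorithm:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative even-split of a record count into N bundle sizes.
final class BundleSplitSketch {
    static List<Long> split(long totalRecords, int desiredNumSplits) {
        List<Long> bundles = new ArrayList<>();
        long base = totalRecords / desiredNumSplits;
        long remainder = totalRecords % desiredNumSplits;
        for (int i = 0; i < desiredNumSplits; i++) {
            // Leading bundles absorb one extra record each until the remainder runs out.
            bundles.add(base + (i < remainder ? 1 : 0));
        }
        return bundles;
    }
}
```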
Sep 18, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 18, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 18, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 18, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 18, 2021 12:46:04 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27755487, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f0cab0a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe7b6b0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab4ae59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77681ce4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d96bdf8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f76c2cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@306f6f1d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d7cac8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fc6deb7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67]
Sep 18, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 18, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 18, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 18, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 18, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 18, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 18, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 18, 2021 12:46:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 18, 2021 12:46:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 18, 2021 12:46:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 18, 2021 12:46:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 18, 2021 12:46:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-18_05_46_05-13954160276063115280?project=apache-beam-testing
Sep 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-18_05_46_05-13954160276063115280
Sep 18, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-18_05_46_05-13954160276063115280
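For automation, the cancel command printed above can be assembled programmatically before handing it to a process launcher. The sketch below only builds the argument list shown in the log; actually executing it would require an installed, authenticated gcloud CLI:

```java
import java.util.Arrays;
import java.util.List;

// Recreates the gcloud cancel invocation printed in the log above.
final class CancelCommandSketch {
    static List<String> cancelCommand(String project, String region, String jobId) {
        return Arrays.asList("gcloud", "dataflow", "jobs",
                "--project=" + project, "cancel", "--region=" + region, jobId);
    }
}
```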
Sep 18, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-18T12:46:12.612Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-euxk. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 18, 2021 12:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:16.960Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:17.905Z: Expanding SplittableParDo operations into optimizable parts.
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:17.935Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:17.982Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.042Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.073Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.118Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.192Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.216Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.249Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.281Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.316Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.346Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.380Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.403Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.436Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.462Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.495Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.517Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.555Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.585Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.612Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.633Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.658Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.688Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.722Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.753Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.786Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.810Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:18.847Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Sep 18, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:19.167Z: Starting 5 workers in us-central1-a...
Sep 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:46:36.282Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 18, 2021 12:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:47:00.929Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 18, 2021 12:47:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:47:53.708Z: Workers have started successfully.
Sep 18, 2021 12:47:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T12:47:53.743Z: Workers have started successfully.
Sep 18, 2021 3:16:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-18T15:16:20.117Z: Staged package beam-vendor-grpc-1_36_0-0.2-2X-EhRNYA0GgyvffOgWohko1OyyyPyTMuiHHvaPG9es.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-2X-EhRNYA0GgyvffOgWohko1OyyyPyTMuiHHvaPG9es.jar' is inaccessible.
Sep 18, 2021 3:16:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-18T15:16:23.362Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 18, 2021 3:19:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-18T15:19:23.198Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 18, 2021 3:22:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-18T15:22:20.165Z: Staged package beam-vendor-grpc-1_36_0-0.2-2X-EhRNYA0GgyvffOgWohko1OyyyPyTMuiHHvaPG9es.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-2X-EhRNYA0GgyvffOgWohko1OyyyPyTMuiHHvaPG9es.jar' is inaccessible.
Sep 18, 2021 3:22:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-18T15:22:23.281Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 18, 2021 3:25:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-18T15:25:23.230Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 18, 2021 3:28:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-18T15:28:20.087Z: Staged package beam-vendor-grpc-1_36_0-0.2-2X-EhRNYA0GgyvffOgWohko1OyyyPyTMuiHHvaPG9es.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-2X-EhRNYA0GgyvffOgWohko1OyyyPyTMuiHHvaPG9es.jar' is inaccessible.
Sep 18, 2021 3:28:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-18T15:28:23.167Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 18, 2021 3:31:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-18T15:31:23.180Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 18, 2021 3:34:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-18T15:34:20.066Z: Staged package beam-vendor-grpc-1_36_0-0.2-2X-EhRNYA0GgyvffOgWohko1OyyyPyTMuiHHvaPG9es.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-2X-EhRNYA0GgyvffOgWohko1OyyyPyTMuiHHvaPG9es.jar' is inaccessible.
Sep 18, 2021 3:34:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-18T15:34:23.272Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 18, 2021 3:37:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-18T15:37:23.332Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 18, 2021 3:40:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-18T15:40:20.054Z: Staged package beam-vendor-grpc-1_36_0-0.2-2X-EhRNYA0GgyvffOgWohko1OyyyPyTMuiHHvaPG9es.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-2X-EhRNYA0GgyvffOgWohko1OyyyPyTMuiHHvaPG9es.jar' is inaccessible.
Sep 18, 2021 3:40:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-18T15:40:23.023Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 18, 2021 3:43:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-18T15:43:23.495Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 18, 2021 3:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-18T15:46:20.398Z: Staged package beam-vendor-grpc-1_36_0-0.2-2X-EhRNYA0GgyvffOgWohko1OyyyPyTMuiHHvaPG9es.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-2X-EhRNYA0GgyvffOgWohko1OyyyPyTMuiHHvaPG9es.jar' is inaccessible.
Sep 18, 2021 3:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-18T15:46:24.334Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 18, 2021 3:49:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-18T15:49:23.660Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 18, 2021 3:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-18T15:52:20.073Z: Staged package beam-vendor-grpc-1_36_0-0.2-2X-EhRNYA0GgyvffOgWohko1OyyyPyTMuiHHvaPG9es.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-2X-EhRNYA0GgyvffOgWohko1OyyyPyTMuiHHvaPG9es.jar' is inaccessible.
Sep 18, 2021 3:52:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-18T15:52:23.123Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 18, 2021 3:55:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-18T15:55:23.733Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 18, 2021 3:58:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-18T15:58:20.125Z: Staged package beam-vendor-grpc-1_36_0-0.2-2X-EhRNYA0GgyvffOgWohko1OyyyPyTMuiHHvaPG9es.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-2X-EhRNYA0GgyvffOgWohko1OyyyPyTMuiHHvaPG9es.jar' is inaccessible.
Sep 18, 2021 3:58:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-18T15:58:23.376Z: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
Sep 18, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T16:00:26.411Z: Cancel request is committed for workflow job: 2021-09-18_05_46_05-13954160276063115280.
Sep 18, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T16:00:26.435Z: Cleaning up.
Sep 18, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T16:00:26.510Z: Stopping worker pool...
Sep 18, 2021 4:00:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T16:00:26.618Z: Stopping worker pool...
Sep 18, 2021 4:02:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T16:02:45.236Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 18, 2021 4:02:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-18T16:02:45.270Z: Worker pool stopped.
Sep 18, 2021 4:02:51 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-18_05_46_05-13954160276063115280 finished with status CANCELLED.
Load test results for test (ID): 0d676911-3e9d-42a8-b740-bd69a8b864f4 and timestamp: 2021-09-18T12:45:58.308000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11500.684
dataflow_v2_java11_total_bytes_count             2.84601622E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210918124341
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2d774c4ea296c365b422dde2269b3f6777600ca504a4a452b8f59df29e3c8d32
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210918124341]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2d774c4ea296c365b422dde2269b3f6777600ca504a4a452b8f59df29e3c8d32]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210918124341] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2d774c4ea296c365b422dde2269b3f6777600ca504a4a452b8f59df29e3c8d32])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2d774c4ea296c365b422dde2269b3f6777600ca504a4a452b8f59df29e3c8d32
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2d774c4ea296c365b422dde2269b3f6777600ca504a4a452b8f59df29e3c8d32
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2d774c4ea296c365b422dde2269b3f6777600ca504a4a452b8f59df29e3c8d32].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 32s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wsdaaft7poty2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #92

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/92/display/redirect?page=changes>

Changes:

[randomstep] [BEAM-12899] Upgrade Gradle to version 6.9.x

[noreply] [BEAM-12701] Added extra parameter in to_csv for DeferredFrame to name

[zyichi] [BEAM-12603] Add retries to FnApiRunnerTest due to flakiness of grpc

[noreply] [BEAM-12535] add dataframes notebook (#15470)


------------------------------------------
[...truncated 49.92 KB...]
ebd13132d2dd: Pushed
9998d23a62f8: Pushed
49750b7dcccd: Pushed
257de4377a3c: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
d00da3cd7763: Layer already exists
4e61e63529c2: Layer already exists
8af1e1eec513: Pushed
cdf5761c3cd6: Pushed
799760671c38: Layer already exists
e625d5644106: Pushed
9a3f32baf9a8: Pushed
20210917124334: digest: sha256:040fcf1a34c1227a112a378852e86b7db64735e7d422f7bbf3e461a5345dd510 size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 17, 2021 12:45:22 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 17, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 17, 2021 12:45:23 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 17, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 17, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 17, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 1 seconds
Sep 17, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 17, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash 68c7ddc9492113411748bb84e881c1fcc72a3845b1a74d3ca0b9986609d6b58a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-aMfdyUkhE0EXSLuE6IHB_McqOEWxp008oLmYZgnWtYo.pb
Sep 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 17, 2021 12:45:28 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d64160c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f254608, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2eeb0f9b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b1c538d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1645f294, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6325f352, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15c4af7a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6cbd0674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55d58825, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@19a64eae, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29a98d9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da3b078, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@544e8149, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b]
Sep 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 17, 2021 12:45:28 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@41853299, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60d40ff4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e5b7fba, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27755487, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f0cab0a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe7b6b0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab4ae59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77681ce4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d96bdf8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f76c2cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@306f6f1d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d7cac8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fc6deb7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b]
Sep 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-17_05_45_28-17019632363355082856?project=apache-beam-testing
Sep 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-17_05_45_28-17019632363355082856
Sep 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-17_05_45_28-17019632363355082856
Sep 17, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-17T12:45:36.321Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-4adb. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:41.847Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:42.587Z: Expanding SplittableParDo operations into optimizable parts.
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:42.623Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:42.696Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:42.767Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:42.800Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:42.867Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:42.983Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.023Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.055Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.083Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.107Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.138Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.194Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.227Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.261Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.294Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.327Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.382Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.419Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.453Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.487Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.520Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.559Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.592Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.626Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.651Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.682Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.719Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.749Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Sep 17, 2021 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:44.065Z: Starting 5 workers in us-central1-a...
Sep 17, 2021 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:56.770Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 17, 2021 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:46:23.992Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 17, 2021 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:47:23.700Z: Workers have started successfully.
Sep 17, 2021 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:47:23.730Z: Workers have started successfully.
Sep 17, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T16:00:27.591Z: Cancel request is committed for workflow job: 2021-09-17_05_45_28-17019632363355082856.
Sep 17, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T16:00:27.669Z: Cleaning up.
Sep 17, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T16:00:27.733Z: Stopping worker pool...
Sep 17, 2021 4:00:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T16:00:27.783Z: Stopping worker pool...
Sep 17, 2021 4:00:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-17T16:00:36.078Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/worker/streaming/streaming_rpc_windmill_service_server.cc:1089
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/dax/workflow/worker/streaming/streaming_rpc_windmill_service_server.cc:1089
Sep 17, 2021 4:02:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T16:02:47.614Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 17, 2021 4:02:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T16:02:47.650Z: Worker pool stopped.
Sep 17, 2021 4:02:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-17_05_45_28-17019632363355082856 finished with status CANCELLED.
Load test results for test (ID): b0ec15fe-b275-441d-9c56-55e080f6e2c1 and timestamp: 2021-09-17T12:45:22.952000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11548.337
dataflow_v2_java11_total_bytes_count             2.04755766E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210917124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:040fcf1a34c1227a112a378852e86b7db64735e7d422f7bbf3e461a5345dd510
Deleted: sha256:51220a157e98970e581fc82c17ae080a7d75c198d4bd5dec5c6ff4c8c0110d2b
Deleted: sha256:1d1b931ea7dd3188599d30a0b00df141ada7a7a4a5ccf60c4e7aa7e60618b2c1
Deleted: sha256:5dbac7b2b241cce8f57417a7d9e6f39a8a6c8fa4d2d1d826f4f672a1ddd49f8f
Deleted: sha256:42a17ddefe018c5adf4da9391b84c1dd878323bb26c6cd1ce078c800506e988a
Deleted: sha256:b7db880cee92f9f53fe649bc6435667f285c74ac9f09897f045207a531478fd9
Deleted: sha256:fb08917ee3996cc5113be9f90a75c20b2eb236bbd16c3c3392f0384eba179726
Deleted: sha256:ded0f8bcd7e15dcb58d3b9f1e8f7d42efbfc06a5cfc6af904d9ec75981f1ecf2
Deleted: sha256:7628ace029b201d217f843a245a0a469e8009f9478092381a5b31efd99ea02be
Deleted: sha256:db83a98067892f629c0dc154a4a11a4d07b95f5546030efad558b78f4a8c34f7
Deleted: sha256:e6cbf762df70e1132a2ae92700d379b555096e58c03da72b5dcc4403e34dcfee
Deleted: sha256:adea99ad81c3fec89cbda76e18d733d27ed8475a65c292d5149d5255d4df37a3
Deleted: sha256:c54d27b5b88135d2c48b53071e2fffab92504a37258eba4f7070888d1a6e4a12
Deleted: sha256:aabf746fb1f34f31d269d68d4397b4fa1499c3a58cfba5815caf6b77852beac8
Deleted: sha256:548b9b1151932c97ed1c2e7b3ecf31165a2f8f6f9a97e44c898e97d1b30bdda3
Deleted: sha256:0bc79cf7c6bd3e9413c107bd1a89eaf0fa06f837792e422e0104ace9758f4e13
Deleted: sha256:66ec577042f379f6cab0f3aba2c14996fb4d892202db514b46f49f669b37828f
Deleted: sha256:9e555a5ad620739800a5be935c9bac8a0fdbbbb9f3c85a81751e64a4478c9432
Deleted: sha256:7a249063b8af977ec773a8ac90ff3fc936007d5a062f301532249dc8929b4625
Deleted: sha256:d2a7bcf00ba40eba94e52a90902d7f5dea6413542bd5883f09fb0503ae227849
Deleted: sha256:35401cb89023ae465a3ed458da089982ec2ada8929aaa9eab7b5c0774406fe97
Deleted: sha256:f5e4f0993712ead6150fe6808de2216017732a6313eae29b966dfdec41128c26
Deleted: sha256:40d95b9e1a99f529776a937184a32d7ccaec26e3c236f9072694e9492d966b46
Deleted: sha256:31db754218e7983d9d30dbf3af0b78c3866904296ace832c41e9e6b238391c34
Deleted: sha256:5e98107d8799b66fdc959243e6051aaedd58bf8fd3a2fa32f528d89608c28912
Deleted: sha256:596eff7aac946664bc43bbcacf63fa5c0759211999a5ec34c9d76a29b340d19f
Deleted: sha256:4485b5c195cb622147812aa99f66a6c759e5358c9e458836d012ac296aa44c96
Deleted: sha256:30fcfe67a4e49646910b554d343c76bd336257c8ae792f41978ea002f32e9ec9
Deleted: sha256:2a61b929e2767dac14ba8a95aa1a8c68916cdd4694f2166f73e207e7b6ecd516
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210917124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:040fcf1a34c1227a112a378852e86b7db64735e7d422f7bbf3e461a5345dd510]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210917124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:040fcf1a34c1227a112a378852e86b7db64735e7d422f7bbf3e461a5345dd510])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:040fcf1a34c1227a112a378852e86b7db64735e7d422f7bbf3e461a5345dd510
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:040fcf1a34c1227a112a378852e86b7db64735e7d422f7bbf3e461a5345dd510
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:040fcf1a34c1227a112a378852e86b7db64735e7d422f7bbf3e461a5345dd510].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 38s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/sqswlj37yu3za

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #91

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/91/display/redirect?page=changes>

Changes:

[dpcollins] [BEAM-12882] - fix test that is flaky when jenkins is overloaded

[noreply] [BEAM-12885] Enable NeedsRunner Tests for Samza Portable Runner (#15512)

[noreply] [BEAM-12100][BEAM-10379][BEAM-9514][BEAM-12647][BEAM-12099]

[noreply] [BEAM-12543] Fix DataFrrame typo (#15509)

[noreply] [BEAM-12794] Remove obsolete uses of sys.exc_info. (#15507)

[noreply] [BEAM-11666]  flake on RecordingManagerTest (#15118)

[kawaigin] [BEAM-10708] Introspect beam_sql output

[noreply] Minor: Restore "Bugfix" section in CHANGES.md (#15516)

[Kyle Weaver] [BEAM-10459] Unignore numeric aggregation tests.

[ajamato] [BEAM-12898] Disable Flink Load tests which are leading Dataproc


------------------------------------------
[...truncated 48.96 KB...]
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
619f3fbf92b9: Preparing
af3a23159c7d: Preparing
1270d47666fd: Preparing
6d0fed1f7ad0: Preparing
c508d04f70e6: Preparing
2014e59c76b4: Preparing
fea88fcf18b1: Preparing
57c8a67db75b: Preparing
7fb2e83f9629: Preparing
96ac1d08cb55: Preparing
a42d90664913: Preparing
60b6f299bc46: Preparing
3891808a925b: Preparing
d402f4f1b906: Preparing
00ef5416d927: Preparing
8555e663f65b: Preparing
d00da3cd7763: Preparing
4e61e63529c2: Preparing
7fb2e83f9629: Waiting
799760671c38: Preparing
3891808a925b: Waiting
d00da3cd7763: Waiting
d402f4f1b906: Waiting
4e61e63529c2: Waiting
799760671c38: Waiting
00ef5416d927: Waiting
8555e663f65b: Waiting
2014e59c76b4: Waiting
fea88fcf18b1: Waiting
57c8a67db75b: Waiting
a42d90664913: Waiting
60b6f299bc46: Waiting
c508d04f70e6: Pushed
1270d47666fd: Pushed
af3a23159c7d: Pushed
2014e59c76b4: Pushed
57c8a67db75b: Pushed
619f3fbf92b9: Pushed
6d0fed1f7ad0: Pushed
7fb2e83f9629: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
d00da3cd7763: Layer already exists
4e61e63529c2: Layer already exists
a42d90664913: Pushed
799760671c38: Layer already exists
60b6f299bc46: Pushed
fea88fcf18b1: Pushed
96ac1d08cb55: Pushed
20210916124428: digest: sha256:30ac48ef18e0b01215057c18fd66d15bcc4d927ca67a8575beda181b516f2adb size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 16, 2021 12:48:27 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 16, 2021 12:48:28 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 16, 2021 12:48:30 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 16, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 16, 2021 12:48:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 16, 2021 12:48:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 4 seconds
Sep 16, 2021 12:48:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 16, 2021 12:48:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash c34dcce505512389ee60a4fed7f649627026e48cb41bc54621bd457366a034d5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-w03M5QVRI4nuYKT-1_ZJYnAm5Iy0G8VGIb1Fc2agNNU.pb
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 16, 2021 12:48:43 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1645f294, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6325f352, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15c4af7a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6cbd0674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55d58825, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@19a64eae, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29a98d9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da3b078, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@544e8149, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711]
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 16, 2021 12:48:43 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f0cab0a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe7b6b0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab4ae59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77681ce4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d96bdf8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f76c2cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@306f6f1d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d7cac8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fc6deb7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639]
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 16, 2021 12:48:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 16, 2021 12:48:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-16_05_48_44-18227987266575069105?project=apache-beam-testing
Sep 16, 2021 12:48:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-16_05_48_44-18227987266575069105
Sep 16, 2021 12:48:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-16_05_48_44-18227987266575069105
Sep 16, 2021 12:48:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-16T12:48:54.833Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-ioq9. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 16, 2021 12:49:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:48:59.689Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 16, 2021 12:49:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:00.610Z: Expanding SplittableParDo operations into optimizable parts.
Sep 16, 2021 12:49:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:00.656Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 16, 2021 12:49:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:00.726Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 16, 2021 12:49:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:00.802Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:00.836Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:00.916Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.036Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.073Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.109Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.143Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.197Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.243Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.281Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.325Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.355Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.395Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.437Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.478Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.531Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.598Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.663Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.707Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.770Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.818Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.882Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.929Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.993Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:02.045Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:02.106Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:02.757Z: Starting 5 workers in us-central1-a...
Sep 16, 2021 12:49:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:11.875Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 16, 2021 12:49:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:47.462Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 16, 2021 12:50:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:50:16.270Z: Workers have started successfully.
Sep 16, 2021 12:50:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:50:16.309Z: Workers have started successfully.
Sep 16, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T16:00:29.931Z: Cancel request is committed for workflow job: 2021-09-16_05_48_44-18227987266575069105.
Sep 16, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T16:00:30.102Z: Cleaning up.
Sep 16, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T16:00:30.175Z: Stopping worker pool...
Sep 16, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T16:00:30.252Z: Stopping worker pool...
Sep 16, 2021 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T16:02:49.981Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 16, 2021 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T16:02:50.023Z: Worker pool stopped.
Sep 16, 2021 4:02:57 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-16_05_48_44-18227987266575069105 finished with status CANCELLED.
Load test results for test (ID): 6af5c062-d71c-461d-816b-11e5ea70a3e0 and timestamp: 2021-09-16T12:48:29.490000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11346.163
dataflow_v2_java11_total_bytes_count             2.18755428E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210916124428
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30ac48ef18e0b01215057c18fd66d15bcc4d927ca67a8575beda181b516f2adb
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210916124428]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30ac48ef18e0b01215057c18fd66d15bcc4d927ca67a8575beda181b516f2adb]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210916124428] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30ac48ef18e0b01215057c18fd66d15bcc4d927ca67a8575beda181b516f2adb])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30ac48ef18e0b01215057c18fd66d15bcc4d927ca67a8575beda181b516f2adb
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30ac48ef18e0b01215057c18fd66d15bcc4d927ca67a8575beda181b516f2adb
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30ac48ef18e0b01215057c18fd66d15bcc4d927ca67a8575beda181b516f2adb].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 20s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/yfimtcaj4hgw4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #90

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/90/display/redirect?page=changes>

Changes:

[dhuntsperger] updated Maven-to-Gradle conversion step in Java quickstart

[noreply] [BEAM-10913] - Updating Grafana from v6.7.3 to v8.1.2 (#15503)

[noreply] [BEAM-12876] Adding doc and glossary entry for resource hints (#15499)

[noreply] [BEAM-12153] revert "implement GroupByKey with CombinePerKey with

[noreply] [BEAM-12845] Add AWS services as a runtime dependency to Spark Job


------------------------------------------
[...truncated 49.30 KB...]
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
8198328d7c99: Preparing
2693c61bbf17: Preparing
afebf81fdc05: Preparing
81dc587f5cfd: Preparing
d6913312cfd2: Preparing
86557afc1619: Preparing
b99dbcf86420: Preparing
5e57d7492af4: Preparing
7aba11bbf702: Preparing
53158045ed29: Preparing
a250b7dbea7d: Preparing
f302772ac2ea: Preparing
3891808a925b: Preparing
d402f4f1b906: Preparing
00ef5416d927: Preparing
8555e663f65b: Preparing
d00da3cd7763: Preparing
4e61e63529c2: Preparing
799760671c38: Preparing
b99dbcf86420: Waiting
5e57d7492af4: Waiting
d00da3cd7763: Waiting
d402f4f1b906: Waiting
7aba11bbf702: Waiting
4e61e63529c2: Waiting
53158045ed29: Waiting
799760671c38: Waiting
86557afc1619: Waiting
00ef5416d927: Waiting
8555e663f65b: Waiting
3891808a925b: Waiting
a250b7dbea7d: Waiting
d6913312cfd2: Pushed
8198328d7c99: Pushed
2693c61bbf17: Pushed
afebf81fdc05: Pushed
81dc587f5cfd: Pushed
86557afc1619: Pushed
7aba11bbf702: Pushed
5e57d7492af4: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
d00da3cd7763: Layer already exists
4e61e63529c2: Layer already exists
799760671c38: Layer already exists
a250b7dbea7d: Pushed
f302772ac2ea: Pushed
b99dbcf86420: Pushed
53158045ed29: Pushed
20210915124338: digest: sha256:4f331aa57513ab483d707531f198b1f395299fa5d5f4d8758bf0d57dd120e45d size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 15, 2021 12:46:05 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 15, 2021 12:46:05 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 15, 2021 12:46:06 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 15, 2021 12:46:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 15, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 15, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Sep 15, 2021 12:46:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 15, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash c30ece65034eb132cbdb8afc1ead21e770f1fd42fe856009cf65eba7aed22f72> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ww7OZQNOsTLL24r8Hq0h53Dx_UL-hWAJz2Xrp67SL3I.pb
Sep 15, 2021 12:46:10 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 15, 2021 12:46:10 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b1c538d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1645f294, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6325f352, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15c4af7a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6cbd0674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55d58825, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@19a64eae, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29a98d9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da3b078, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@544e8149, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376]
Sep 15, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 15, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 15, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 15, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 15, 2021 12:46:11 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27755487, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f0cab0a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe7b6b0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab4ae59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77681ce4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d96bdf8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f76c2cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@306f6f1d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d7cac8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fc6deb7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67]
Sep 15, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 15, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 15, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 15, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 15, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 15, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 15, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 15, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 15, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 15, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 15, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 15, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 15, 2021 12:46:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-15_05_46_11-1910158258827079702?project=apache-beam-testing
Sep 15, 2021 12:46:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-15_05_46_11-1910158258827079702
Sep 15, 2021 12:46:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-15_05_46_11-1910158258827079702
Sep 15, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-15T12:46:18.254Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-lphq. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 15, 2021 12:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:22.220Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 15, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:22.984Z: Expanding SplittableParDo operations into optimizable parts.
Sep 15, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.027Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 15, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.104Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 15, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.198Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Sep 15, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.244Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 15, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.306Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 15, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.426Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 15, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.470Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 15, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.495Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 15, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.528Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 15, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.565Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 15, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.600Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 15, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.636Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 15, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.670Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 15, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.709Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 15, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.757Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 15, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.786Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Sep 15, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.812Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Sep 15, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.835Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 15, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.887Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 15, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.922Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 15, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:23.973Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 15, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:24.006Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 15, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:24.052Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Sep 15, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:24.079Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Sep 15, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:24.108Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Sep 15, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:24.142Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 15, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:24.176Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Sep 15, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:24.213Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Sep 15, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:24.649Z: Starting 5 workers in us-central1-a...
Sep 15, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:46:55.358Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 15, 2021 12:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:47:08.765Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 15, 2021 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:47:41.143Z: Workers have started successfully.
Sep 15, 2021 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T12:47:41.182Z: Workers have started successfully.
Sep 15, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T16:00:32.489Z: Cancel request is committed for workflow job: 2021-09-15_05_46_11-1910158258827079702.
Sep 15, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T16:00:35.562Z: Cleaning up.
Sep 15, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T16:00:35.665Z: Stopping worker pool...
Sep 15, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T16:00:35.724Z: Stopping worker pool...
Sep 15, 2021 4:03:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T16:02:59.577Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 15, 2021 4:03:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-15T16:02:59.615Z: Worker pool stopped.
Sep 15, 2021 4:03:04 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-15_05_46_11-1910158258827079702 finished with status CANCELLED.
Load test results for test (ID): db12e9c0-bc25-43b4-b05e-dc4ae84c3998 and timestamp: 2021-09-15T12:46:05.999000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11508.297
dataflow_v2_java11_total_bytes_count             3.33595543E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
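
The stack trace above shows the load-test harness turning the CANCELLED terminal state into a build failure: the job was deliberately cancelled after its run window, but `JobFailure.handleFailure` rejects any terminal state other than a successful one. The following is a minimal, hypothetical sketch of that behavior (the enum and method names here are illustrative, not Beam's actual API):

```java
// Hypothetical sketch: a harness that treats any terminal job state other
// than DONE as a failure, producing the "Invalid job state: CANCELLED."
// message seen in the log above.
public class JobFailureSketch {
    enum JobState { DONE, CANCELLED, FAILED }

    // Throws when the pipeline did not reach a successful terminal state.
    static void handleFailure(JobState terminalState) {
        if (terminalState != JobState.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(JobState.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // prints "Invalid job state: CANCELLED."
        }
    }
}
```

Under this reading, the exception (and the resulting non-zero exit value reported by Gradle below) is expected whenever the streaming job ends via cancellation rather than draining to completion.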

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210915124338
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4f331aa57513ab483d707531f198b1f395299fa5d5f4d8758bf0d57dd120e45d
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210915124338]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4f331aa57513ab483d707531f198b1f395299fa5d5f4d8758bf0d57dd120e45d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210915124338] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4f331aa57513ab483d707531f198b1f395299fa5d5f4d8758bf0d57dd120e45d])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4f331aa57513ab483d707531f198b1f395299fa5d5f4d8758bf0d57dd120e45d
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4f331aa57513ab483d707531f198b1f395299fa5d5f4d8758bf0d57dd120e45d
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4f331aa57513ab483d707531f198b1f395299fa5d5f4d8758bf0d57dd120e45d].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 56s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/tsskotcggr3to

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #89

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/89/display/redirect?page=changes>

Changes:

[rohde.samuel] [BEAM-12842] Add timestamp to test work item to deflake

[dhuntsperger] fixed broken Python tab on HCatalog IO page

[suztomo] [BEAM-12873] HL7v2IO: to leave schematizedData null, not empty

[noreply] Avoid apiary submission of job graph when it is not needed. (#15458)

[noreply] [BEAM-7261] Add support for BasicSessionCredentials for AWS credentials.

[noreply] Bump dataflow java container version to beam-master-20210913 (#15506)

[noreply] [BEAM-11980] Java GCS - Implement IO Request Count metrics (#15394)


------------------------------------------
[...truncated 49.05 KB...]
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
1ee98579fc46: Preparing
b54e7baec2d3: Preparing
f612e1c6b2d2: Preparing
4b667ff364a2: Preparing
cc4bbd4ca4e2: Preparing
4948d45599e7: Preparing
21b5103441a9: Preparing
bc91acaffd9e: Preparing
e9f9e2ba990c: Preparing
7d8292d6730b: Preparing
12c7f9e457ef: Preparing
d37471bdb99e: Preparing
3891808a925b: Preparing
d402f4f1b906: Preparing
00ef5416d927: Preparing
8555e663f65b: Preparing
d00da3cd7763: Preparing
4e61e63529c2: Preparing
799760671c38: Preparing
12c7f9e457ef: Waiting
d37471bdb99e: Waiting
d00da3cd7763: Waiting
799760671c38: Waiting
21b5103441a9: Waiting
3891808a925b: Waiting
e9f9e2ba990c: Waiting
bc91acaffd9e: Waiting
d402f4f1b906: Waiting
00ef5416d927: Waiting
8555e663f65b: Waiting
7d8292d6730b: Waiting
4948d45599e7: Waiting
b54e7baec2d3: Pushed
cc4bbd4ca4e2: Pushed
f612e1c6b2d2: Pushed
4948d45599e7: Pushed
1ee98579fc46: Pushed
4b667ff364a2: Pushed
bc91acaffd9e: Pushed
e9f9e2ba990c: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
d00da3cd7763: Layer already exists
4e61e63529c2: Layer already exists
21b5103441a9: Pushed
d37471bdb99e: Pushed
12c7f9e457ef: Pushed
799760671c38: Layer already exists
7d8292d6730b: Pushed
20210914124408: digest: sha256:7ef12d2436940dd2cbf0580d73a7825d2f817dfbcbd4aca75d266b79927be9d6 size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 14, 2021 12:46:22 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 14, 2021 12:46:22 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 14, 2021 12:46:23 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 14, 2021 12:46:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 14, 2021 12:46:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 14, 2021 12:46:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Sep 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash b180b453a33235a76be50b03c08b766e5c93e37f397942242f81202c6b15ccf0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-sYC0U6MyNadr5QsDwIt2blyT4385eUIkL4EgLGsVzPA.pb
Sep 14, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 14, 2021 12:46:27 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d64160c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f254608, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2eeb0f9b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b1c538d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1645f294, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6325f352, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15c4af7a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6cbd0674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55d58825, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@19a64eae, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29a98d9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da3b078, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@544e8149, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b]
Sep 14, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 14, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 14, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 14, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 14, 2021 12:46:27 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@41853299, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60d40ff4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e5b7fba, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27755487, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f0cab0a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe7b6b0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab4ae59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77681ce4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d96bdf8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f76c2cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@306f6f1d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d7cac8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fc6deb7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b]
Sep 14, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 14, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 14, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 14, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 14, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 14, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 14, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 14, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 14, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 14, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 14, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 14, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 14, 2021 12:46:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-14_05_46_28-9592787095606509542?project=apache-beam-testing
Sep 14, 2021 12:46:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-14_05_46_28-9592787095606509542
Sep 14, 2021 12:46:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-14_05_46_28-9592787095606509542
Sep 14, 2021 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-14T12:46:35.418Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-sz6l. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:42.775Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:43.429Z: Expanding SplittableParDo operations into optimizable parts.
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:43.464Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:43.524Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:43.592Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:43.644Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:43.710Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:43.811Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:43.846Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:43.879Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:43.914Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:43.947Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:43.967Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:44.001Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:44.026Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:44.058Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:44.083Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:44.114Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:44.138Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:44.168Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:44.192Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:44.224Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:44.258Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:44.284Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:44.315Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:44.350Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:44.382Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:44.406Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:44.439Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:44.475Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Sep 14, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:46:44.790Z: Starting 5 workers in us-central1-a...
Sep 14, 2021 12:47:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:47:16.242Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 14, 2021 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:47:28.377Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 14, 2021 12:47:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:47:56.988Z: Workers have started successfully.
Sep 14, 2021 12:47:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T12:47:57.025Z: Workers have started successfully.
Sep 14, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T16:00:25.222Z: Cancel request is committed for workflow job: 2021-09-14_05_46_28-9592787095606509542.
Sep 14, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T16:00:25.283Z: Cleaning up.
Sep 14, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T16:00:25.390Z: Stopping worker pool...
Sep 14, 2021 4:00:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T16:00:25.433Z: Stopping worker pool...
Sep 14, 2021 4:02:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T16:02:50.141Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 14, 2021 4:02:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-14T16:02:50.176Z: Worker pool stopped.
Sep 14, 2021 4:02:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-14_05_46_28-9592787095606509542 finished with status CANCELLED.
Load test results for test (ID): 930ef99c-602c-44b6-bb94-d9037d14ce6f and timestamp: 2021-09-14T12:46:22.857000000Z:
                               Metric:            Value:
        dataflow_v2_java11_runtime_sec          11477.29
  dataflow_v2_java11_total_bytes_count     2.02747331E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210914124408
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7ef12d2436940dd2cbf0580d73a7825d2f817dfbcbd4aca75d266b79927be9d6
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210914124408]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7ef12d2436940dd2cbf0580d73a7825d2f817dfbcbd4aca75d266b79927be9d6]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210914124408] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7ef12d2436940dd2cbf0580d73a7825d2f817dfbcbd4aca75d266b79927be9d6])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7ef12d2436940dd2cbf0580d73a7825d2f817dfbcbd4aca75d266b79927be9d6
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7ef12d2436940dd2cbf0580d73a7825d2f817dfbcbd4aca75d266b79927be9d6
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7ef12d2436940dd2cbf0580d73a7825d2f817dfbcbd4aca75d266b79927be9d6].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 19s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rvrf6gazkmdoi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #88

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/88/display/redirect>

Changes:


------------------------------------------
[...truncated 40.78 KB...]
> Task :runners:google-cloud-dataflow-java:jar
> Task :sdks:java:container:java11:copyDockerfileDependencies
> Task :sdks:java:container:buildLinuxAmd64
> Task :sdks:java:container:goBuild
> Task :sdks:java:container:java11:copySdkHarnessLauncher
> Task :sdks:java:container:generateLicenseReport

> Task :sdks:java:container:pullLicenses
Copying already-fetched licenses from <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/reports/dependency-license> to <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses>
Already using interpreter /usr/bin/python3
Using base prefix '/usr'
New python executable in <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python3>
Also creating executable in <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python>
Installing setuptools, pip, wheel...
done.
DEPRECATION: Python 3.5 reached the end of its life on September 13th, 2020. Please upgrade your Python as Python 3.5 is no longer maintained. pip 21.0 will drop support for Python 3.5 in January 2021. pip 21.0 will remove support for this functionality.
Collecting beautifulsoup4<5.0,>=4.9.0
  Using cached beautifulsoup4-4.10.0-py3-none-any.whl (97 kB)
Collecting future<1.0.0,>=0.16.0
  Using cached future-0.18.2-py3-none-any.whl
Collecting pyyaml<6.0.0,>=3.12
  Using cached PyYAML-5.3.1-cp35-cp35m-linux_x86_64.whl
Collecting tenacity<6.0,>=5.0.2
  Using cached tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting soupsieve>1.2
  Using cached soupsieve-2.1-py3-none-any.whl (32 kB)
Collecting six>=1.9.0
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Installing collected packages: soupsieve, six, tenacity, pyyaml, future, beautifulsoup4
Successfully installed beautifulsoup4-4.10.0 future-0.18.2 pyyaml-5.3.1 six-1.16.0 soupsieve-2.1 tenacity-5.1.5
Executing <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python> <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py> --license_index=<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/reports/dependency-license/index.json>        --output_dir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses>        --dep_url_yaml=<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml> 
INFO:root:Pulling license for 207 dependencies using 16 threads.
INFO:root:pull_licenses_java.py succeed. It took 1.776828 seconds with 16 threads.
Copying licenses from <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses> to <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/third_party_licenses.>
Finished license_scripts.sh

> Task :sdks:java:container:java11:copyJavaThirdPartyLicenses

> Task :release:go-licenses:java:dockerRun
+ go-licenses save github.com/apache/beam/sdks/java/container --save_path=/output/licenses
+ go-licenses csv github.com/apache/beam/sdks/java/container
+ tee /output/licenses/list.csv
github.com/golang/protobuf,https://github.com/golang/protobuf/blob/master/LICENSE,BSD-3-Clause
google.golang.org/genproto/googleapis/rpc/status,https://github.com/googleapis/go-genproto/blob/master/LICENSE,Apache-2.0
github.com/apache/beam/sdks/java/container,https://github.com/apache/beam/blob/master/LICENSE,Apache-2.0
golang.org/x/net,https://go.googlesource.com/net/+/refs/heads/master/LICENSE,BSD-3-Clause
google.golang.org/protobuf,https://go.googlesource.com/protobuf/+/refs/heads/master/LICENSE,BSD-3-Clause
golang.org/x/sys,https://go.googlesource.com/sys/+/refs/heads/master/LICENSE,BSD-3-Clause
golang.org/x/text,https://go.googlesource.com/text/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/apache/beam/sdks/go/pkg/beam,https://github.com/apache/beam/blob/master/sdks/go/README.md,Apache-2.0
google.golang.org/grpc,https://github.com/grpc/grpc-go/blob/master/LICENSE,Apache-2.0
+ chmod -R a+w /output/licenses

> Task :release:go-licenses:java:createLicenses
> Task :sdks:java:container:java11:copyGolangLicenses
> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:container:java11:docker

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.

See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
cb48aa7c3179: Preparing
6e9c6659ed75: Preparing
89019dcc934a: Preparing
e500a7c139f7: Preparing
aad82327faab: Preparing
3a1781cb8501: Preparing
aa57da95a980: Preparing
8308d19d28ab: Preparing
eceaf9948f0b: Preparing
ca6d6093da44: Preparing
1646e7e61e2f: Preparing
a4dadbe25bfa: Preparing
3891808a925b: Preparing
d402f4f1b906: Preparing
00ef5416d927: Preparing
8555e663f65b: Preparing
d00da3cd7763: Preparing
4e61e63529c2: Preparing
799760671c38: Preparing
eceaf9948f0b: Waiting
3a1781cb8501: Waiting
ca6d6093da44: Waiting
1646e7e61e2f: Waiting
a4dadbe25bfa: Waiting
8308d19d28ab: Waiting
aa57da95a980: Waiting
3891808a925b: Waiting
8555e663f65b: Waiting
d402f4f1b906: Waiting
d00da3cd7763: Waiting
799760671c38: Waiting
aad82327faab: Pushed
6e9c6659ed75: Pushed
89019dcc934a: Pushed
cb48aa7c3179: Pushed
3a1781cb8501: Pushed
e500a7c139f7: Pushed
8308d19d28ab: Pushed
eceaf9948f0b: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
1646e7e61e2f: Pushed
d00da3cd7763: Layer already exists
a4dadbe25bfa: Pushed
799760671c38: Layer already exists
4e61e63529c2: Layer already exists
aa57da95a980: Pushed
ca6d6093da44: Pushed
20210913124334: digest: sha256:13c61f0a3c9b5ea8be327ef9eeb69ec575c0fd95c30c78307c7da6641ab9aae0 size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 13, 2021 12:45:19 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 13, 2021 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 13, 2021 12:45:20 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 13, 2021 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 13, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 13, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Sep 13, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 13, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash fb1c024c1df5d5db05a96558d9dc52719b1cd78cffbc301084b5614ebc8c7438> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline--xwCTB311dsFqWVY2dxScZsc14z_vDAQhLVhTryMdDg.pb
Sep 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 13, 2021 12:45:25 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2eeb0f9b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b1c538d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1645f294, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6325f352, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15c4af7a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6cbd0674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55d58825, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@19a64eae, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29a98d9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da3b078, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@544e8149, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef]
Sep 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 13, 2021 12:45:25 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e5b7fba, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27755487, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f0cab0a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe7b6b0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab4ae59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77681ce4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d96bdf8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f76c2cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@306f6f1d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d7cac8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fc6deb7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713]
Sep 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 13, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-13_05_45_25-8165500818705185032?project=apache-beam-testing
Sep 13, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-13_05_45_25-8165500818705185032
Sep 13, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-13_05_45_25-8165500818705185032
Sep 13, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-13T12:45:32.952Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-3zb5. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 13, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:45:38.295Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 13, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-13T12:45:39.084Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/24386 instances, 2/0 CPUs, 30/183701 disk GB, 0/2397 SSD disk GB, 1/288 instance groups, 1/291 managed instance groups, 1/517 instance templates, 1/615 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
Sep 13, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:45:39.118Z: Cleaning up.
Sep 13, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:45:39.176Z: Worker pool stopped.
Sep 13, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:45:40.339Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 13, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-13_05_45_25-8165500818705185032 failed with status FAILED.
Sep 13, 2021 12:45:44 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace co_gbk
Load test results for test (ID): 81bc422d-745c-4dac-85e0-d7a45094d161 and timestamp: 2021-09-13T12:45:20.539000000Z:
                               Metric:            Value:
        dataflow_v2_java11_runtime_sec               0.0
  dataflow_v2_java11_total_bytes_count              -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid job state: FAILED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210913124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:13c61f0a3c9b5ea8be327ef9eeb69ec575c0fd95c30c78307c7da6641ab9aae0
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210913124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:13c61f0a3c9b5ea8be327ef9eeb69ec575c0fd95c30c78307c7da6641ab9aae0]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210913124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:13c61f0a3c9b5ea8be327ef9eeb69ec575c0fd95c30c78307c7da6641ab9aae0])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:13c61f0a3c9b5ea8be327ef9eeb69ec575c0fd95c30c78307c7da6641ab9aae0
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:13c61f0a3c9b5ea8be327ef9eeb69ec575c0fd95c30c78307c7da6641ab9aae0
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:13c61f0a3c9b5ea8be327ef9eeb69ec575c0fd95c30c78307c7da6641ab9aae0].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 29s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/klzclaj5rpfdc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #87

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/87/display/redirect>

Changes:


------------------------------------------
[...truncated 41.96 KB...]
Using base prefix '/usr'
New python executable in <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python3>
Also creating executable in <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python>
Installing setuptools, pip, wheel...
done.
DEPRECATION: Python 3.5 reached the end of its life on September 13th, 2020. Please upgrade your Python as Python 3.5 is no longer maintained. pip 21.0 will drop support for Python 3.5 in January 2021. pip 21.0 will remove support for this functionality.
Collecting beautifulsoup4<5.0,>=4.9.0
  Using cached beautifulsoup4-4.10.0-py3-none-any.whl (97 kB)
Collecting future<1.0.0,>=0.16.0
  Using cached future-0.18.2-py3-none-any.whl
Collecting pyyaml<6.0.0,>=3.12
  Using cached PyYAML-5.3.1-cp35-cp35m-linux_x86_64.whl
Collecting tenacity<6.0,>=5.0.2
  Using cached tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting soupsieve>1.2
  Using cached soupsieve-2.1-py3-none-any.whl (32 kB)
Collecting six>=1.9.0
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Installing collected packages: soupsieve, six, tenacity, pyyaml, future, beautifulsoup4
Successfully installed beautifulsoup4-4.10.0 future-0.18.2 pyyaml-5.3.1 six-1.16.0 soupsieve-2.1 tenacity-5.1.5
Executing <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python> <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py> --license_index=<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/reports/dependency-license/index.json>        --output_dir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses>        --dep_url_yaml=<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml> 
INFO:root:Pulling license for 207 dependencies using 16 threads.
INFO:root:pull_licenses_java.py succeed. It took 1.993543 seconds with 16 threads.
Copying licenses from <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses> to <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/third_party_licenses.>

> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:container:pullLicenses
Finished license_scripts.sh

> Task :sdks:java:container:java11:copyJavaThirdPartyLicenses

> Task :release:go-licenses:java:dockerRun
+ go-licenses save github.com/apache/beam/sdks/java/container --save_path=/output/licenses
+ go-licenses csv github.com/apache/beam/sdks/java/container
+ tee /output/licenses/list.csv
google.golang.org/genproto/googleapis/rpc/status,https://github.com/googleapis/go-genproto/blob/master/LICENSE,Apache-2.0
golang.org/x/text,https://go.googlesource.com/text/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/apache/beam/sdks/go/pkg/beam,https://github.com/apache/beam/blob/master/sdks/go/README.md,Apache-2.0
google.golang.org/grpc,https://github.com/grpc/grpc-go/blob/master/LICENSE,Apache-2.0
golang.org/x/net,https://go.googlesource.com/net/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/golang/protobuf,https://github.com/golang/protobuf/blob/master/LICENSE,BSD-3-Clause
google.golang.org/protobuf,https://go.googlesource.com/protobuf/+/refs/heads/master/LICENSE,BSD-3-Clause
golang.org/x/sys,https://go.googlesource.com/sys/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/apache/beam/sdks/java/container,https://github.com/apache/beam/blob/master/LICENSE,Apache-2.0
+ chmod -R a+w /output/licenses

> Task :release:go-licenses:java:createLicenses
> Task :sdks:java:container:java11:copyGolangLicenses
> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:container:java11:docker

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.

See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker
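The deprecation notice above already names the replacement workflow; as a minimal sketch (the image name `gcr.io/project-id/my-image` is the placeholder from the warning itself, not a real image in this build):

```shell
# One-time setup: register gcloud as a Docker credential helper,
# so plain `docker` can authenticate to GCR registries.
gcloud auth configure-docker

# Afterwards, use docker directly -- no `gcloud docker` wrapper needed.
docker pull gcr.io/project-id/my-image
```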

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
965419297464: Preparing
af934d8ebe65: Preparing
a7b8054e29c9: Preparing
098bbdafffcb: Preparing
a28016f210da: Preparing
01a67a58c544: Preparing
3dbf36d043e0: Preparing
428bd452b211: Preparing
e387380d47be: Preparing
84336d956992: Preparing
c88bf45cc2c4: Preparing
eb070d43fe59: Preparing
3891808a925b: Preparing
d402f4f1b906: Preparing
00ef5416d927: Preparing
8555e663f65b: Preparing
d00da3cd7763: Preparing
4e61e63529c2: Preparing
799760671c38: Preparing
428bd452b211: Waiting
84336d956992: Waiting
c88bf45cc2c4: Waiting
e387380d47be: Waiting
8555e663f65b: Waiting
01a67a58c544: Waiting
d00da3cd7763: Waiting
eb070d43fe59: Waiting
799760671c38: Waiting
4e61e63529c2: Waiting
3dbf36d043e0: Waiting
00ef5416d927: Waiting
d402f4f1b906: Waiting
a7b8054e29c9: Pushed
a28016f210da: Pushed
af934d8ebe65: Pushed
965419297464: Pushed
01a67a58c544: Pushed
098bbdafffcb: Pushed
428bd452b211: Pushed
e387380d47be: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
3dbf36d043e0: Pushed
d00da3cd7763: Layer already exists
eb070d43fe59: Pushed
4e61e63529c2: Layer already exists
799760671c38: Layer already exists
c88bf45cc2c4: Pushed
84336d956992: Pushed
20210912124340: digest: sha256:0e7f1ce6a58536211c674557a311d6d86359f8368b0b2861656162f876ada5f4 size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 12, 2021 12:46:09 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 12, 2021 12:46:10 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 12, 2021 12:46:11 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 12, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 12, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 12, 2021 12:46:14 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Sep 12, 2021 12:46:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 12, 2021 12:46:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash e9f936bb1b87ac265f45e5791c0ab45d144a7985ab86aa04cca515fdea242295> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-6fk2uxuHrCZfReV5HAq0XRRKeYWrhqoEzKUV_eokIpU.pb
Sep 12, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 12, 2021 12:46:16 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d64160c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f254608, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2eeb0f9b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b1c538d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1645f294, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6325f352, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15c4af7a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6cbd0674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55d58825, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@19a64eae, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29a98d9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da3b078, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@544e8149, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b]
Sep 12, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 12, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 12, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 12, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 12, 2021 12:46:16 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@41853299, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60d40ff4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e5b7fba, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27755487, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f0cab0a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe7b6b0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab4ae59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77681ce4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d96bdf8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f76c2cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@306f6f1d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d7cac8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fc6deb7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b]
Sep 12, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 12, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 12, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 12, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 12, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 12, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 12, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 12, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 12, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 12, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 12, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 12, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 12, 2021 12:46:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_05_46_16-1467752297453580481?project=apache-beam-testing
Sep 12, 2021 12:46:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-12_05_46_16-1467752297453580481
Sep 12, 2021 12:46:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-12_05_46_16-1467752297453580481
Sep 12, 2021 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-12T12:46:24.495Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-4g7j. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 12, 2021 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:46:29.782Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 12, 2021 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-12T12:46:30.604Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/24378 instances, 2/0 CPUs, 30/186331 disk GB, 0/2397 SSD disk GB, 1/280 instance groups, 1/283 managed instance groups, 1/508 instance templates, 1/607 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
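The quota summary in the failure message (required/available CPUs was 2/0) can be cross-checked against the region's live quotas. A hedged sketch, using the project and region named in the log and assuming an authenticated `gcloud` with access to that project:

```shell
# List Compute Engine quota metrics, current usage, and limits for the
# region where the workflow failed (us-central1, project apache-beam-testing).
gcloud compute regions describe us-central1 \
  --project=apache-beam-testing \
  --flatten="quotas[]" \
  --format="table(quotas.metric,quotas.usage,quotas.limit)"
```

If the CPUS row shows usage at or near its limit, the fix is either to wait for other test jobs to release instances or to request a quota increase via the page linked above.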
Sep 12, 2021 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:46:30.642Z: Cleaning up.
Sep 12, 2021 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:46:30.693Z: Worker pool stopped.
Sep 12, 2021 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:46:31.857Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 12, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-12_05_46_16-1467752297453580481 failed with status FAILED.
Sep 12, 2021 12:46:38 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace co_gbk
Load test results for test (ID): 34b8d352-cbb2-47e4-9b8b-76c9e2782560 and timestamp: 2021-09-12T12:46:10.674000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                       0.0
dataflow_v2_java11_total_bytes_count                      -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid job state: FAILED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210912124340
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0e7f1ce6a58536211c674557a311d6d86359f8368b0b2861656162f876ada5f4
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210912124340]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0e7f1ce6a58536211c674557a311d6d86359f8368b0b2861656162f876ada5f4]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210912124340] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0e7f1ce6a58536211c674557a311d6d86359f8368b0b2861656162f876ada5f4])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0e7f1ce6a58536211c674557a311d6d86359f8368b0b2861656162f876ada5f4
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0e7f1ce6a58536211c674557a311d6d86359f8368b0b2861656162f876ada5f4
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0e7f1ce6a58536211c674557a311d6d86359f8368b0b2861656162f876ada5f4].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 18s
101 actionable tasks: 71 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/az74l4n3vnsjm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure