Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/07/22 12:24:29 UTC

Build failed in Jenkins: beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11 #397

See <https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11/397/display/redirect?page=changes>

Changes:

[balazs.nemeth] BEAM-14525 Fix for Protobuf getter/setter method name discovery issue

[balazs.nemeth] BEAM-14525 Added a proto message with the problematic properties to use

[balazs.nemeth] PR CR: updating issue links

[noreply] added olehborysevych as collaborator (#22391)

[noreply] Add accept-language header for MPL license (#22395)

[noreply] Bump terser from 5.9.0 to 5.14.2 in

[noreply] Fixes #22156: Fix Spark3 runner to compile against Spark 3.2/3.3 and add


------------------------------------------
[...truncated 48.71 KB...]
+ go-licenses csv github.com/apache/beam/sdks/v2/java/container
+ tee /output/licenses/list.csv
W0722 12:05:56.842882     189 library.go:94] "golang.org/x/sys/unix" contains non-Go code that can't be inspected for further dependencies:
/go/pkg/mod/golang.org/x/sys@v0.0.0-20220520151302-bc2c85ada10a/unix/asm_linux_amd64.s
github.com/apache/beam/sdks/v2/go/pkg/beam,https://github.com/apache/beam/blob/sdks/v2.40.0/sdks/go/README.md,Apache-2.0
github.com/apache/beam/sdks/v2/java/container,https://github.com/apache/beam/blob/sdks/v2.40.0/sdks/LICENSE,Apache-2.0
github.com/golang/protobuf,https://github.com/golang/protobuf/blob/v1.5.2/LICENSE,BSD-3-Clause
golang.org/x/net,https://cs.opensource.google/go/x/net/+/c690dde0:LICENSE,BSD-3-Clause
golang.org/x/sys,https://cs.opensource.google/go/x/sys/+/bc2c85ad:LICENSE,BSD-3-Clause
golang.org/x/text,https://cs.opensource.google/go/x/text/+/v0.3.7:LICENSE,BSD-3-Clause
google.golang.org/genproto/googleapis/rpc/status,https://github.com/googleapis/go-genproto/blob/e326c6e8e9c8/LICENSE,Apache-2.0
google.golang.org/grpc,https://github.com/grpc/grpc-go/blob/v1.47.0/LICENSE,Apache-2.0
google.golang.org/protobuf,https://github.com/protocolbuffers/protobuf-go/blob/v1.28.0/LICENSE,BSD-3-Clause
+ chmod -R a+w /output/licenses

> Task :release:go-licenses:java:createLicenses
> Task :sdks:java:container:java11:copyGolangLicenses
> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:container:java11:docker

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.

See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker
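The replacement workflow the warning describes is a one-time credential-helper setup followed by ordinary docker commands. A minimal sketch, reusing the same placeholder image name the warning itself gives:

# One-time setup: register gcloud as a Docker credential helper for GCR.
gcloud auth configure-docker
# Afterwards, plain docker commands work against GCR registries (placeholder image name).
docker pull gcr.io/project-id/my-image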

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
0724dca2e633: Preparing
acfface8f1f9: Preparing
7a3a3a13be3a: Preparing
e0040b21a755: Preparing
9387b6c883e3: Preparing
1df2ec4c75d7: Preparing
e24cb9b31695: Preparing
15325c6a4fb2: Preparing
4c9a918c3252: Preparing
91a1dd420386: Preparing
b18a7f59ba31: Preparing
47bb3dbb0d36: Preparing
e24cb9b31695: Waiting
1bf524557a40: Preparing
1b0d13ba195e: Preparing
15325c6a4fb2: Waiting
e5ac3c85d294: Preparing
9791b94d7b98: Preparing
1bf524557a40: Waiting
b18a7f59ba31: Waiting
2f1e2f8ca577: Preparing
4dc3dda529a0: Preparing
7372faf8e603: Preparing
9be7f4e74e71: Preparing
36cd374265f4: Preparing
5bdeef4a08f3: Preparing
9be7f4e74e71: Waiting
36cd374265f4: Waiting
4dc3dda529a0: Waiting
47bb3dbb0d36: Waiting
7372faf8e603: Waiting
e5ac3c85d294: Waiting
acfface8f1f9: Pushed
7a3a3a13be3a: Pushed
9387b6c883e3: Pushed
e0040b21a755: Pushed
0724dca2e633: Pushed
15325c6a4fb2: Pushed
e24cb9b31695: Pushed
91a1dd420386: Pushed
1df2ec4c75d7: Pushed
4c9a918c3252: Pushed
47bb3dbb0d36: Pushed
9791b94d7b98: Layer already exists
b18a7f59ba31: Pushed
2f1e2f8ca577: Layer already exists
4dc3dda529a0: Layer already exists
7372faf8e603: Layer already exists
9be7f4e74e71: Layer already exists
5bdeef4a08f3: Layer already exists
36cd374265f4: Layer already exists
e5ac3c85d294: Pushed
1b0d13ba195e: Pushed
1bf524557a40: Pushed
20220722120440: digest: sha256:a29815605a9b04e99b1097f3cecd4ac8ddf05ab82b5ccffe8dd784f3fd8f66ae size: 4935

> Task :sdks:java:testing:load-tests:run
Jul 22, 2022 12:06:42 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jul 22, 2022 12:06:43 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 222 files. Enable logging at DEBUG level to see which files will be staged.
Jul 22, 2022 12:06:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jul 22, 2022 12:06:47 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 222 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jul 22, 2022 12:06:48 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 222 files cached, 0 files newly uploaded in 0 seconds
Jul 22, 2022 12:06:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jul 22, 2022 12:06:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <98394 bytes, hash 73a9237587c4da199e235f6605abaea85f9a21693d7fbc2495b0894675188601> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-c6kjdYfE2hmeI19mBauuqF-aIWk9f7wklbCJRnUYhgE.pb
Jul 22, 2022 12:06:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jul 22, 2022 12:06:50 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0, endOffset=10000000}, SyntheticUnboundedSource{startOffset=10000000, endOffset=20000000}, SyntheticUnboundedSource{startOffset=20000000, endOffset=30000000}, SyntheticUnboundedSource{startOffset=30000000, endOffset=40000000}, SyntheticUnboundedSource{startOffset=40000000, endOffset=50000000}, SyntheticUnboundedSource{startOffset=50000000, endOffset=60000000}, SyntheticUnboundedSource{startOffset=60000000, endOffset=70000000}, SyntheticUnboundedSource{startOffset=70000000, endOffset=80000000}, SyntheticUnboundedSource{startOffset=80000000, endOffset=90000000}, SyntheticUnboundedSource{startOffset=90000000, endOffset=100000000}, SyntheticUnboundedSource{startOffset=100000000, endOffset=110000000}, SyntheticUnboundedSource{startOffset=110000000, endOffset=120000000}, SyntheticUnboundedSource{startOffset=120000000, endOffset=130000000}, SyntheticUnboundedSource{startOffset=130000000, endOffset=140000000}, SyntheticUnboundedSource{startOffset=140000000, endOffset=150000000}, SyntheticUnboundedSource{startOffset=150000000, endOffset=160000000}, SyntheticUnboundedSource{startOffset=160000000, endOffset=170000000}, SyntheticUnboundedSource{startOffset=170000000, endOffset=180000000}, SyntheticUnboundedSource{startOffset=180000000, endOffset=190000000}, SyntheticUnboundedSource{startOffset=190000000, endOffset=200000000}]
Jul 22, 2022 12:06:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jul 22, 2022 12:06:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics as step s3
Jul 22, 2022 12:06:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Total bytes monitor as step s4
Jul 22, 2022 12:06:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s5
Jul 22, 2022 12:06:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Group by key (0) as step s6
Jul 22, 2022 12:06:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate (0) as step s7
Jul 22, 2022 12:06:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics (0) as step s8
Jul 22, 2022 12:06:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.41.0-SNAPSHOT
Jul 22, 2022 12:06:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-22_05_06_50-10102100034116914747?project=apache-beam-testing
Jul 22, 2022 12:06:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-07-22_05_06_50-10102100034116914747
Jul 22, 2022 12:06:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-07-22_05_06_50-10102100034116914747
Jul 22, 2022 12:06:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-07-22T12:06:55.546Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0gbk01-jenkins-0722-ag2m. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jul 22, 2022 12:07:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:02.289Z: Worker configuration: e2-standard-2 in us-central1-a.
Jul 22, 2022 12:07:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:03.095Z: Expanding SplittableParDo operations into optimizable parts.
Jul 22, 2022 12:07:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:03.123Z: Expanding CollectionToSingleton operations into optimizable parts.
Jul 22, 2022 12:07:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:03.190Z: Expanding CoGroupByKey operations into optimizable parts.
Jul 22, 2022 12:07:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:03.312Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jul 22, 2022 12:07:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:03.342Z: Expanding GroupByKey operations into streaming Read/Write steps
Jul 22, 2022 12:07:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:03.403Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jul 22, 2022 12:07:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:03.517Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jul 22, 2022 12:07:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:03.549Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jul 22, 2022 12:07:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:03.578Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jul 22, 2022 12:07:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:03.601Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jul 22, 2022 12:07:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:03.627Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jul 22, 2022 12:07:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:03.661Z: Fusing consumer Collect start time metrics/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jul 22, 2022 12:07:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:03.721Z: Fusing consumer Total bytes monitor/ParMultiDo(ByteMonitor) into Collect start time metrics/ParMultiDo(TimeMonitor)
Jul 22, 2022 12:07:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:03.775Z: Fusing consumer Window.Into()/Window.Assign into Total bytes monitor/ParMultiDo(ByteMonitor)
Jul 22, 2022 12:07:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:03.850Z: Fusing consumer Group by key (0)/WriteStream into Window.Into()/Window.Assign
Jul 22, 2022 12:07:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:03.886Z: Fusing consumer Group by key (0)/MergeBuckets into Group by key (0)/ReadStream
Jul 22, 2022 12:07:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:03.923Z: Fusing consumer Ungroup and reiterate (0)/ParMultiDo(UngroupAndReiterate) into Group by key (0)/MergeBuckets
Jul 22, 2022 12:07:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:03.961Z: Fusing consumer Collect end time metrics (0)/ParMultiDo(TimeMonitor) into Ungroup and reiterate (0)/ParMultiDo(UngroupAndReiterate)
Jul 22, 2022 12:07:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:04.078Z: Running job using Streaming Engine
Jul 22, 2022 12:07:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:04.418Z: Starting 5 workers in us-central1-a...
Jul 22, 2022 12:07:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:30.572Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
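Deleting an old or unused custom metric descriptor goes through the Monitoring API method linked above (monitoring.projects.metricDescriptors.delete). A hedged command-line sketch; PROJECT_ID and custom.googleapis.com/old_metric are placeholders, not names taken from this job:

# Placeholder resource name; substitute the project and the metric type before use.
curl -X DELETE \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://monitoring.googleapis.com/v3/projects/PROJECT_ID/metricDescriptors/custom.googleapis.com%2Fold_metric"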
Jul 22, 2022 12:07:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:07:42.742Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jul 22, 2022 12:08:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:08:42.023Z: Workers have started successfully.
Jul 22, 2022 12:23:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:23:21.770Z: Cleaning up.
Jul 22, 2022 12:23:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:23:21.877Z: Stopping worker pool...
Jul 22, 2022 12:23:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:23:21.933Z: Stopping worker pool...
Jul 22, 2022 12:24:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:24:02.615Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jul 22, 2022 12:24:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-22T12:24:02.703Z: Worker pool stopped.
Jul 22, 2022 12:24:14 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-07-22_05_06_50-10102100034116914747 finished with status DONE.
Load test results for test (ID): cc87a76d-24ee-4faf-b588-f1bbbc3234f5 and timestamp: 2022-07-22T12:06:43.618000000Z:
Metric:                                   Value:
dataflow_v2_java11_runtime_sec            792.773
dataflow_v2_java11_total_bytes_count      1.9999998E9

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220722120440
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a29815605a9b04e99b1097f3cecd4ac8ddf05ab82b5ccffe8dd784f3fd8f66ae
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220722120440]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a29815605a9b04e99b1097f3cecd4ac8ddf05ab82b5ccffe8dd784f3fd8f66ae]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220722120440] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a29815605a9b04e99b1097f3cecd4ac8ddf05ab82b5ccffe8dd784f3fd8f66ae])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a29815605a9b04e99b1097f3cecd4ac8ddf05ab82b5ccffe8dd784f3fd8f66ae
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a29815605a9b04e99b1097f3cecd4ac8ddf05ab82b5ccffe8dd784f3fd8f66ae
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a29815605a9b04e99b1097f3cecd4ac8ddf05ab82b5ccffe8dd784f3fd8f66ae].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5562b300e52fd7448f8733d8d982d4ee9a3932ce7e989e98d2b003cf83ec9c78
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5562b300e52fd7448f8733d8d982d4ee9a3932ce7e989e98d2b003cf83ec9c78
ERROR: (gcloud.container.images.delete) Not found: response: {'docker-distribution-api-version': 'registry/2.0', 'content-type': 'application/json', 'date': 'Fri, 22 Jul 2022 12:24:25 GMT', 'server': 'Docker Registry', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '168', '-content-encoding': 'gzip'}
Failed to compute blob liveness for manifest: 'sha256:5562b300e52fd7448f8733d8d982d4ee9a3932ce7e989e98d2b003cf83ec9c78': None
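The digest reported here was apparently already gone by the time the script tried to delete it, so the 404 is a cleanup race rather than a problem with the build itself. A hypothetical tolerant variant of the delete (placeholder digest; this is not the project's actual cleanup_untagged_gcr_images.sh logic):

# Ignore "Not found" so an already-deleted digest does not fail the cleanup step.
gcloud container images delete \
  "us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:<digest>" \
  --force-delete-tags --quiet || true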

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 298

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
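A hedged way to apply those suggestions to just the failing task, assuming the build is re-run through the Gradle wrapper from the workspace root (task path taken from the failure above):

# Re-run only the failing cleanup task with extra diagnostics.
./gradlew :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages --stacktrace --info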

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 20m 1s
110 actionable tasks: 73 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ujtks4wyrn5qk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11 #398

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11/398/display/redirect?page=changes>

