Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/11/12 12:40:50 UTC

Build failed in Jenkins: beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11 #510

See <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/510/display/redirect?page=changes>

Changes:

[noreply] Add TFX support in pydoc (#23960)

[noreply] Bump cloud.google.com/go/bigtable from 1.17.0 to 1.18.0 in /sdks

[noreply] disable (#24121)

[noreply] Implement PubsubRowToMessage transform (#23897)

[noreply] upgrade testcontainer dependency (#24123)

[noreply] More cleanup containers (#24105)

[noreply] Bump github.com/aws/aws-sdk-go-v2/service/s3 in /sdks (#24112)

[noreply] Bump google.golang.org/api from 0.102.0 to 0.103.0 in /sdks (#24049)

[noreply] Update staging of Python wheels (#24114)

[noreply] Add a ValidatesContainer integration test for use_sibling_sdk_workers

[noreply] Fix checkArgument format string in TestStream (#24134)


------------------------------------------
[...truncated 46.80 KB...]
/home/jenkins/go/bin/go1.18.1 build -o ./build/target/linux_amd64/boot boot.go boot_test.go

> Task :sdks:java:container:java11:copySdkHarnessLauncher
Execution optimizations have been disabled for task ':sdks:java:container:java11:copySdkHarnessLauncher' to ensure correctness due to the following reasons:
  - Gradle detected a problem with the following location: '<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target>'. Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this output of task ':sdks:java:container:downloadCloudProfilerAgent' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed. Please refer to https://docs.gradle.org/7.5.1/userguide/validation_problems.html#implicit_dependency for more details about this problem.
  - Gradle detected a problem with the following location: '<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target>'. Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this output of task ':sdks:java:container:pullLicenses' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed. Please refer to https://docs.gradle.org/7.5.1/userguide/validation_problems.html#implicit_dependency for more details about this problem.

> Task :release:go-licenses:java:createLicenses
> Task :sdks:java:container:java11:copyGolangLicenses
> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:container:java11:docker

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.

See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
2c3247d8d9d9: Preparing
a09705d9ff33: Preparing
00d7e4c5bd9c: Preparing
df54ad3f2a4d: Preparing
40f8c900a0d5: Preparing
0281f852dc48: Preparing
1796ef2dd894: Preparing
649df817948b: Preparing
82e1ec896c87: Preparing
513b3ec36fa2: Preparing
b3ced96cf542: Preparing
dcaaeced3cdd: Preparing
21daa7260a5b: Preparing
fa460405b624: Preparing
cc7c584cbb2d: Preparing
7b7f3078e1db: Preparing
826c3ddbb29c: Preparing
b626401ef603: Preparing
9b55156abf26: Preparing
293d5db30c9f: Preparing
03127cdb479b: Preparing
9c742cd6c7a5: Preparing
dcaaeced3cdd: Waiting
513b3ec36fa2: Waiting
21daa7260a5b: Waiting
b3ced96cf542: Waiting
649df817948b: Waiting
fa460405b624: Waiting
9b55156abf26: Waiting
82e1ec896c87: Waiting
9c742cd6c7a5: Waiting
cc7c584cbb2d: Waiting
293d5db30c9f: Waiting
03127cdb479b: Waiting
b626401ef603: Waiting
826c3ddbb29c: Waiting
1796ef2dd894: Waiting
0281f852dc48: Waiting
40f8c900a0d5: Pushed
df54ad3f2a4d: Pushed
a09705d9ff33: Pushed
00d7e4c5bd9c: Pushed
2c3247d8d9d9: Pushed
649df817948b: Pushed
1796ef2dd894: Pushed
513b3ec36fa2: Pushed
0281f852dc48: Pushed
82e1ec896c87: Pushed
b3ced96cf542: Pushed
7b7f3078e1db: Layer already exists
dcaaeced3cdd: Pushed
826c3ddbb29c: Layer already exists
b626401ef603: Layer already exists
9b55156abf26: Layer already exists
293d5db30c9f: Layer already exists
03127cdb479b: Layer already exists
9c742cd6c7a5: Layer already exists
fa460405b624: Pushed
cc7c584cbb2d: Pushed
21daa7260a5b: Pushed
20221112123732: digest: sha256:d2612813e8fd0b56018f7eb92ab9801c354564e6ccb4a081481f4f7c4f1584d1 size: 4935

> Task :sdks:java:testing:load-tests:run
Nov 12, 2022 12:38:38 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
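
The fallback above is harmless, but the staging location can also be set explicitly on the Dataflow pipeline options. A minimal sketch, assuming hypothetical bucket names:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class StagingLocationExample {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);
        // Hypothetical buckets; when stagingLocation is unset, the runner falls
        // back to gcpTempLocation, as in the log line above.
        options.setGcpTempLocation("gs://my-temp-bucket/temp");
        options.setStagingLocation("gs://my-temp-bucket/staging");
      }
    }
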
Nov 12, 2022 12:38:39 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 229 files. Enable logging at DEBUG level to see which files will be staged.
Nov 12, 2022 12:38:39 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: ParDo(TimeMonitor)
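
This warning typically appears when the same transform (ParDo(TimeMonitor) here, applied again later as ParDo(TimeMonitor)2) is applied more than once without explicit names, which matters when updating a streaming job. A small, hypothetical sketch of giving each application its own stable name:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;

    public class StableNamesExample {
      // Stand-in for the load test's TimeMonitor DoFn.
      static class TimeMonitor extends DoFn<String, String> {
        @ProcessElement
        public void process(@Element String in, OutputReceiver<String> out) {
          out.output(in); // real monitoring logic elided
        }
      }

      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply("Read input", Create.of("a", "b"))
            // Distinct explicit names keep transform names stable and unique.
            .apply("Monitor ingest time", ParDo.of(new TimeMonitor()))
            .apply("Monitor output time", ParDo.of(new TimeMonitor()));
        p.run().waitUntilFinish();
      }
    }
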
Nov 12, 2022 12:38:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 12, 2022 12:38:41 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 229 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 12, 2022 12:38:42 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 229 files cached, 0 files newly uploaded in 0 seconds
Nov 12, 2022 12:38:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 12, 2022 12:38:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <117744 bytes, hash 2467e98cda8059d848f011008f9e31c4efa529d8a0869199f5d42bdb11958dcb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-JGfpjNqAWdhI8BEAj54xxO-lKdighpGZ9dQr2xGVjcs.pb
Nov 12, 2022 12:38:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 12, 2022 12:38:44 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0, endOffset=1000000}, SyntheticUnboundedSource{startOffset=1000000, endOffset=2000000}, SyntheticUnboundedSource{startOffset=2000000, endOffset=3000000}, SyntheticUnboundedSource{startOffset=3000000, endOffset=4000000}, SyntheticUnboundedSource{startOffset=4000000, endOffset=5000000}, SyntheticUnboundedSource{startOffset=5000000, endOffset=6000000}, SyntheticUnboundedSource{startOffset=6000000, endOffset=7000000}, SyntheticUnboundedSource{startOffset=7000000, endOffset=8000000}, SyntheticUnboundedSource{startOffset=8000000, endOffset=9000000}, SyntheticUnboundedSource{startOffset=9000000, endOffset=10000000}, SyntheticUnboundedSource{startOffset=10000000, endOffset=11000000}, SyntheticUnboundedSource{startOffset=11000000, endOffset=12000000}, SyntheticUnboundedSource{startOffset=12000000, endOffset=13000000}, SyntheticUnboundedSource{startOffset=13000000, endOffset=14000000}, SyntheticUnboundedSource{startOffset=14000000, endOffset=15000000}, SyntheticUnboundedSource{startOffset=15000000, endOffset=16000000}, SyntheticUnboundedSource{startOffset=16000000, endOffset=17000000}, SyntheticUnboundedSource{startOffset=17000000, endOffset=18000000}, SyntheticUnboundedSource{startOffset=18000000, endOffset=19000000}, SyntheticUnboundedSource{startOffset=19000000, endOffset=20000000}]
Nov 12, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 12, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
Nov 12, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s4
Nov 12, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s5
Nov 12, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s6
Nov 12, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s7
Nov 12, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s8
Nov 12, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s9
Nov 12, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s10
Nov 12, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s11
Nov 12, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s12
Nov 12, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s13
Nov 12, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s14
Nov 12, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s15
Nov 12, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.44.0-SNAPSHOT
Nov 12, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-12_04_38_44-6440028729128725637?project=apache-beam-testing
Nov 12, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-11-12_04_38_44-6440028729128725637
Nov 12, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-11-12_04_38_44-6440028729128725637
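
The same cancellation can also be issued from the submitting process through the PipelineResult returned by run(); a small sketch, assuming p is the pipeline object built by the load test:

    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    public class CancelExample {
      // Hypothetical helper: submit a job, then cancel it programmatically
      // (equivalent in effect to the gcloud command above).
      static void submitThenCancel(Pipeline p) throws IOException {
        PipelineResult result = p.run();
        // ... monitoring, timeout handling, etc. ...
        result.cancel();
      }
    }
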
Nov 12, 2022 12:38:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-11-12T12:38:49.031Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0pardo01-jenkins-11-49y3. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
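
The label warning can be avoided by giving the job a name that is already a valid Cloud Label (lowercase letters, digits, and hyphens). A hypothetical sketch:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class JobNameExample {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);
        // Hypothetical name using only lowercase letters, digits and hyphens,
        // so Dataflow does not have to rewrite it for Cloud Labels.
        options.setJobName("load-tests-java11-dataflow-v2-streaming-pardo-1");
      }
    }
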
Nov 12, 2022 12:38:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:56.700Z: Worker configuration: e2-standard-2 in us-central1-a.
Nov 12, 2022 12:38:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:57.714Z: Expanding SplittableParDo operations into optimizable parts.
Nov 12, 2022 12:38:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:57.747Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 12, 2022 12:38:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:57.859Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 12, 2022 12:38:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:57.884Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 12, 2022 12:38:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:57.909Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:57.944Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.021Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.060Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.084Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.110Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.144Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.169Z: Fusing consumer ParDo(TimeMonitor)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.188Z: Fusing consumer ParDo(ByteMonitor)/ParMultiDo(ByteMonitor) into ParDo(TimeMonitor)/ParMultiDo(TimeMonitor)
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.212Z: Fusing consumer Step: 0/ParMultiDo(CounterOperation) into ParDo(ByteMonitor)/ParMultiDo(ByteMonitor)
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.244Z: Fusing consumer Step: 1/ParMultiDo(CounterOperation) into Step: 0/ParMultiDo(CounterOperation)
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.278Z: Fusing consumer Step: 2/ParMultiDo(CounterOperation) into Step: 1/ParMultiDo(CounterOperation)
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.304Z: Fusing consumer Step: 3/ParMultiDo(CounterOperation) into Step: 2/ParMultiDo(CounterOperation)
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.332Z: Fusing consumer Step: 4/ParMultiDo(CounterOperation) into Step: 3/ParMultiDo(CounterOperation)
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.358Z: Fusing consumer Step: 5/ParMultiDo(CounterOperation) into Step: 4/ParMultiDo(CounterOperation)
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.390Z: Fusing consumer Step: 6/ParMultiDo(CounterOperation) into Step: 5/ParMultiDo(CounterOperation)
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.423Z: Fusing consumer Step: 7/ParMultiDo(CounterOperation) into Step: 6/ParMultiDo(CounterOperation)
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.458Z: Fusing consumer Step: 8/ParMultiDo(CounterOperation) into Step: 7/ParMultiDo(CounterOperation)
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.491Z: Fusing consumer Step: 9/ParMultiDo(CounterOperation) into Step: 8/ParMultiDo(CounterOperation)
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.525Z: Fusing consumer ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor) into Step: 9/ParMultiDo(CounterOperation)
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:58.619Z: Running job using Streaming Engine
Nov 12, 2022 12:39:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:38:59.915Z: Starting 5 workers in us-central1-a...
Nov 12, 2022 12:39:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:39:06.680Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 12, 2022 12:39:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:39:41.797Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
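
The range that autoscaling operates within is bounded by the worker-pool options; a minimal sketch matching the 5 workers and machine type reported above (whether this load test pins these exact values is an assumption):

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WorkerPoolExample {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);
        options.setNumWorkers(5);                      // initial worker count
        options.setMaxNumWorkers(5);                   // upper bound for autoscaling
        options.setWorkerMachineType("e2-standard-2"); // machine type reported in the log
      }
    }
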

> Task :sdks:java:testing:load-tests:run FAILED
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=2e082f80-63e7-4ed0-9b5f-ac6514b2a219, currentDir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 300518
  log file: /home/jenkins/.gradle/daemon/7.5.1/daemon-300518.out.log
----- Last  20 lines from daemon log file - daemon-300518.out.log -----
Nov 12, 2022 12:39:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:39:06.680Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 12, 2022 12:39:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-12T12:39:41.797Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
Remove shutdown hook failed
java.lang.IllegalStateException: Shutdown in progress
	at java.lang.ApplicationShutdownHooks.remove(ApplicationShutdownHooks.java:82)
	at java.lang.Runtime.removeShutdownHook(Runtime.java:231)
	at org.gradle.process.internal.shutdown.ShutdownHooks.removeShutdownHook(ShutdownHooks.java:38)
	at org.gradle.process.internal.DefaultExecHandle.setEndStateInfo(DefaultExecHandle.java:208)
	at org.gradle.process.internal.DefaultExecHandle.aborted(DefaultExecHandle.java:366)
	at org.gradle.process.internal.ExecHandleRunner.completed(ExecHandleRunner.java:108)
	at org.gradle.process.internal.ExecHandleRunner.run(ExecHandleRunner.java:84)
	at org.gradle.internal.operations.CurrentBuildOperationPreservingRunnable.run(CurrentBuildOperationPreservingRunnable.java:42)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org
> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11 #511

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/511/display/redirect>

