Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/02/05 16:01:47 UTC

Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_Streaming #554

See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_Streaming/554/display/redirect?page=changes>

Changes:

[Boyuan Zhang] [BEAM-11325] Support KafkaIO dynamic read

[Kyle Weaver] [BEAM-10925] Enable user-defined Java scalar functions in ZetaSQL.

[sychen] Fix the check on maxBufferingDuration

[Kyle Weaver] address review comments

[noreply] Remove an unused reference to staleTimerSet and reword the commentary.

[noreply] [BEAM-11715] Partial revert of "Combiner packing in Dataflow" (#13763)


------------------------------------------
[...truncated 26.15 KB...]
> Task :model:job-management:shadowJar FROM-CACHE
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar FROM-CACHE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar FROM-CACHE

> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
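(For reference, the unchecked-operation details the note mentions can be surfaced by passing the flag to javac; in a Gradle build this is typically done as below. This is a sketch of the standard mechanism, not the load-tests project's actual configuration.)

```groovy
// Hypothetical build.gradle fragment: enable unchecked-operation
// warnings for all Java compilation tasks in this project.
tasks.withType(JavaCompile) {
    options.compilerArgs << "-Xlint:unchecked"
}
```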

> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
Feb 05, 2021 12:23:17 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Feb 05, 2021 12:23:18 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 187 files. Enable logging at DEBUG level to see which files will be staged.
Feb 05, 2021 12:23:18 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
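(For reference, Beam emits this warning when the same transform type is applied more than once without an explicit name, which makes pipeline update/compatibility checks unreliable. A stable name can be supplied at application time. The fragment below is a sketch only: it assumes Beam on the classpath, and the variable and step names are hypothetical, not taken from the actual load test.)

```java
// Sketch: give each Window.Into() application an explicit, stable name
// via the two-argument apply(name, transform) overload. Names shown
// here ("Window input", "Window co-input") are illustrative.
PCollection<KV<byte[], byte[]>> windowedInput =
    input.apply("Window input",
        Window.into(FixedWindows.of(Duration.standardSeconds(60))));
PCollection<KV<byte[], byte[]>> windowedCoInput =
    coInput.apply("Window co-input",
        Window.into(FixedWindows.of(Duration.standardSeconds(60))));
```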
Feb 05, 2021 12:23:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Feb 05, 2021 12:23:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 188 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Feb 05, 2021 12:23:21 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.29.0-SNAPSHOT-A6m4bxgyKMMyKxqw0gkVdW8WGw4nJLEEWxooQpL0-zk.jar
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 188 files cached, 0 files newly uploaded in 0 seconds
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Feb 05, 2021 12:23:22 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dc79225, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@30e9ca13, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46185a1b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@51288417, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60cf62ad, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e0895f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1ac4ccad, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fd9ebde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@14982a82, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ee5b2d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@72f8ae0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@323f3c96, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6726cc69, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b6d92e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33899f7a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7899de11, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@290d10ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1bc0d349, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@644ded04, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5292ceca]
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Feb 05, 2021 12:23:22 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@252a8aae, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3d4e405e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54e2fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70972170, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@119aa36, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e1a46fb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@69fe0ed4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@20ab3e3a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6caf7803, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@709ed6f3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@698fee9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@102c577f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d44a19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1fb2d5e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1716e8c5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6573d2f7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4052c8c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@181b8c4b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@38eb0f4d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@437486cd]
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <94483 bytes, hash 4b592562eacf0fed403a253ca04d14a14a8117ece508589f972fd6acbb57b662> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-S1klYurPD-1AOiU8oE0UoUqBF-zlCFifly_WrLtXtmI.pb
Feb 05, 2021 12:23:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.29.0-SNAPSHOT
Feb 05, 2021 12:23:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-05_04_23_23-6037329587660715791?project=apache-beam-testing
Feb 05, 2021 12:23:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-02-05_04_23_23-6037329587660715791
Feb 05, 2021 12:23:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-02-05_04_23_23-6037329587660715791
Feb 05, 2021 12:23:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-02-05T12:23:26.726Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java0dataflow0streaming0cogbk01-jenkins-0205122-voe0. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:30.157Z: Worker configuration: n1-standard-4 in us-central1-f.
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:30.830Z: Expanding CoGroupByKey operations into optimizable parts.
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.025Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.046Z: Expanding GroupByKey operations into streaming Read/Write steps
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.112Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.222Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.246Z: Unzipping flatten s11 for input s10.org.apache.beam.sdk.values.PCollection.<init>:402#5f2ef1f005ae0b4
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.282Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable1
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.305Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable0
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.329Z: Fusing consumer Read input/StripIds into Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.367Z: Fusing consumer Read co-input/StripIds into Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.414Z: Fusing consumer Collect start time metrics (co-input) into Read co-input/StripIds
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.441Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.491Z: Fusing consumer CoGroupByKey/MakeUnionTable1 into Window.Into()2/Window.Assign
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.524Z: Fusing consumer Collect start time metrics (input) into Read input/StripIds
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.565Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.596Z: Fusing consumer CoGroupByKey/MakeUnionTable0 into Window.Into()/Window.Assign
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.646Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.685Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn into CoGroupByKey/GBK/MergeBuckets
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.711Z: Fusing consumer Ungroup and reiterate into CoGroupByKey/ConstructCoGbkResultFn
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.739Z: Fusing consumer Collect total bytes into Ungroup and reiterate
Feb 05, 2021 12:23:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:31.767Z: Fusing consumer Collect end time metrics into Collect total bytes
Feb 05, 2021 12:23:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:32.147Z: Executing operation CoGroupByKey/GBK/ReadStream+CoGroupByKey/GBK/MergeBuckets+CoGroupByKey/ConstructCoGbkResultFn+Ungroup and reiterate+Collect total bytes+Collect end time metrics
Feb 05, 2021 12:23:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:32.180Z: Executing operation Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read co-input/StripIds+Collect start time metrics (co-input)+Window.Into()2/Window.Assign+CoGroupByKey/MakeUnionTable1+CoGroupByKey/GBK/WriteStream
Feb 05, 2021 12:23:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:32.210Z: Executing operation Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read input/StripIds+Collect start time metrics (input)+Window.Into()/Window.Assign+CoGroupByKey/MakeUnionTable0+CoGroupByKey/GBK/WriteStream
Feb 05, 2021 12:23:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:32.266Z: Starting 5 workers in us-central1-f...
Feb 05, 2021 12:24:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:23:59.626Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Feb 05, 2021 12:24:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:24:15.913Z: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 05, 2021 12:24:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:24:15.961Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
Feb 05, 2021 12:24:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:24:26.488Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 05, 2021 12:24:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:24:49.165Z: Workers have started successfully.
Feb 05, 2021 12:24:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:24:49.190Z: Workers have started successfully.
Feb 05, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T16:00:34.410Z: Cancel request is committed for workflow job: 2021-02-05_04_23_23-6037329587660715791.
Feb 05, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T16:00:34.432Z: Finished operation CoGroupByKey/GBK/ReadStream+CoGroupByKey/GBK/MergeBuckets+CoGroupByKey/ConstructCoGbkResultFn+Ungroup and reiterate+Collect total bytes+Collect end time metrics
Feb 05, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T16:00:34.433Z: Finished operation Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read input/StripIds+Collect start time metrics (input)+Window.Into()/Window.Assign+CoGroupByKey/MakeUnionTable0+CoGroupByKey/GBK/WriteStream
Feb 05, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T16:00:34.434Z: Finished operation Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read co-input/StripIds+Collect start time metrics (co-input)+Window.Into()2/Window.Assign+CoGroupByKey/MakeUnionTable1+CoGroupByKey/GBK/WriteStream
Feb 05, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T16:00:34.679Z: Cleaning up.
Feb 05, 2021 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T16:00:34.798Z: Stopping worker pool...
Feb 05, 2021 4:01:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T16:01:35.863Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 05, 2021 4:01:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T16:01:35.908Z: Worker pool stopped.
Feb 05, 2021 4:01:44 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-02-05_04_23_23-6037329587660715791 finished with status CANCELLED.
Feb 05, 2021 4:01:45 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace co_gbk
Load test results for test (ID): 33a15c6e-2397-4dc6-988f-bd90b4f2d802 and timestamp: 2021-02-05T12:23:18.391000000Z:
                 Metric:                    Value:
    dataflow_runtime_sec                  12225.22
dataflow_total_bytes_count                      -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:137)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 39m 25s
90 actionable tasks: 56 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/2g5e77y6vduvc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_LoadTests_Java_CoGBK_Dataflow_Streaming #557

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_Streaming/557/display/redirect>



Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_Streaming #556

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_Streaming/556/display/redirect>

Changes:


------------------------------------------
[...truncated 82.15 KB...]
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava UP-TO-DATE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava UP-TO-DATE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:jar UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava UP-TO-DATE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :sdks:java:expansion-service:compileJava UP-TO-DATE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :sdks:java:testing:load-tests:compileJava UP-TO-DATE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:jar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar UP-TO-DATE

> Task :sdks:java:testing:load-tests:run
Feb 07, 2021 12:42:54 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Feb 07, 2021 12:42:54 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 187 files. Enable logging at DEBUG level to see which files will be staged.
Feb 07, 2021 12:42:55 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Feb 07, 2021 12:42:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Feb 07, 2021 12:42:58 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 188 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Feb 07, 2021 12:42:58 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.29.0-SNAPSHOT-A6m4bxgyKMMyKxqw0gkVdW8WGw4nJLEEWxooQpL0-zk.jar
Feb 07, 2021 12:42:58 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 188 files cached, 0 files newly uploaded in 0 seconds
Feb 07, 2021 12:42:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Feb 07, 2021 12:42:58 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@30e9ca13, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46185a1b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@51288417, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60cf62ad, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e0895f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1ac4ccad, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fd9ebde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@14982a82, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ee5b2d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@72f8ae0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@323f3c96, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6726cc69, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b6d92e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33899f7a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7899de11, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@290d10ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1bc0d349, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@644ded04, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5292ceca, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@13d9261f]
Feb 07, 2021 12:42:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Feb 07, 2021 12:42:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Feb 07, 2021 12:42:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Feb 07, 2021 12:42:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Feb 07, 2021 12:42:59 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3d4e405e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54e2fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70972170, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@119aa36, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e1a46fb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@69fe0ed4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@20ab3e3a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6caf7803, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@709ed6f3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@698fee9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@102c577f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d44a19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1fb2d5e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1716e8c5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6573d2f7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4052c8c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@181b8c4b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@38eb0f4d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@437486cd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15b642b9]
Feb 07, 2021 12:42:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Feb 07, 2021 12:42:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Feb 07, 2021 12:42:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Feb 07, 2021 12:42:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Feb 07, 2021 12:42:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Feb 07, 2021 12:42:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Feb 07, 2021 12:42:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Feb 07, 2021 12:42:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Feb 07, 2021 12:42:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Feb 07, 2021 12:42:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Feb 07, 2021 12:42:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Feb 07, 2021 12:42:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
Feb 07, 2021 12:42:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.29.0-SNAPSHOT
Feb 07, 2021 12:43:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-07_04_42_59-9940591228842353327?project=apache-beam-testing
Feb 07, 2021 12:43:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-02-07_04_42_59-9940591228842353327
Feb 07, 2021 12:43:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-02-07_04_42_59-9940591228842353327
Feb 07, 2021 12:43:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-02-07T12:43:02.677Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java0dataflow0streaming0cogbk02-jenkins-0207124-3q7q. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Feb 07, 2021 12:43:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:06.094Z: Worker configuration: n1-standard-4 in us-central1-f.
Feb 07, 2021 12:43:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:06.813Z: Expanding CoGroupByKey operations into optimizable parts.
Feb 07, 2021 12:43:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:06.867Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Feb 07, 2021 12:43:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:06.891Z: Expanding GroupByKey operations into streaming Read/Write steps
Feb 07, 2021 12:43:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:06.950Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Feb 07, 2021 12:43:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:07.045Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Feb 07, 2021 12:43:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:07.073Z: Unzipping flatten s11 for input s10.org.apache.beam.sdk.values.PCollection.<init>:402#5f2ef1f005ae0b4
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:07.112Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable1
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:07.140Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable0
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:07.174Z: Fusing consumer Read input/StripIds into Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:07.204Z: Fusing consumer Read co-input/StripIds into Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:07.238Z: Fusing consumer Collect start time metrics (co-input) into Read co-input/StripIds
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:07.263Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:07.295Z: Fusing consumer CoGroupByKey/MakeUnionTable1 into Window.Into()2/Window.Assign
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:07.328Z: Fusing consumer Collect start time metrics (input) into Read input/StripIds
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:07.371Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:07.402Z: Fusing consumer CoGroupByKey/MakeUnionTable0 into Window.Into()/Window.Assign
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:07.430Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:07.453Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn into CoGroupByKey/GBK/MergeBuckets
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:07.492Z: Fusing consumer Ungroup and reiterate into CoGroupByKey/ConstructCoGbkResultFn
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:07.520Z: Fusing consumer Collect total bytes into Ungroup and reiterate
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:07.550Z: Fusing consumer Collect end time metrics into Collect total bytes
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:08.090Z: Executing operation CoGroupByKey/GBK/ReadStream+CoGroupByKey/GBK/MergeBuckets+CoGroupByKey/ConstructCoGbkResultFn+Ungroup and reiterate+Collect total bytes+Collect end time metrics
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:08.129Z: Executing operation Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read co-input/StripIds+Collect start time metrics (co-input)+Window.Into()2/Window.Assign+CoGroupByKey/MakeUnionTable1+CoGroupByKey/GBK/WriteStream
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:08.165Z: Starting 5 workers in us-central1-f...
Feb 07, 2021 12:43:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:08.166Z: Executing operation Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read input/StripIds+Collect start time metrics (input)+Window.Into()/Window.Assign+CoGroupByKey/MakeUnionTable0+CoGroupByKey/GBK/WriteStream
Feb 07, 2021 12:43:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:39.124Z: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 07, 2021 12:43:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:39.153Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
Feb 07, 2021 12:43:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:41.425Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Feb 07, 2021 12:43:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:43:49.639Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 07, 2021 12:44:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:44:13.147Z: Workers have started successfully.
Feb 07, 2021 12:44:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T12:44:13.176Z: Workers have started successfully.
Feb 07, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T16:00:33.411Z: Cancel request is committed for workflow job: 2021-02-07_04_42_59-9940591228842353327.
Feb 07, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T16:00:33.433Z: Finished operation CoGroupByKey/GBK/ReadStream+CoGroupByKey/GBK/MergeBuckets+CoGroupByKey/ConstructCoGbkResultFn+Ungroup and reiterate+Collect total bytes+Collect end time metrics
Feb 07, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T16:00:33.434Z: Finished operation Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read input/StripIds+Collect start time metrics (input)+Window.Into()/Window.Assign+CoGroupByKey/MakeUnionTable0+CoGroupByKey/GBK/WriteStream
Feb 07, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T16:00:33.434Z: Finished operation Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read co-input/StripIds+Collect start time metrics (co-input)+Window.Into()2/Window.Assign+CoGroupByKey/MakeUnionTable1+CoGroupByKey/GBK/WriteStream
Feb 07, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T16:00:33.689Z: Cleaning up.
Feb 07, 2021 4:00:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T16:00:33.798Z: Stopping worker pool...
Feb 07, 2021 4:01:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T16:01:36.559Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 07, 2021 4:01:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-07T16:01:36.596Z: Worker pool stopped.
Feb 07, 2021 4:01:43 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-02-07_04_42_59-9940591228842353327 finished with status CANCELLED.
Feb 07, 2021 4:01:44 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace co_gbk
Load test results for test (ID): d937ecac-b8b3-4c41-8eef-6e329e94221b and timestamp: 2021-02-07T12:42:55.293000000Z:
                 Metric:                    Value:
    dataflow_runtime_sec                  9319.466
dataflow_total_bytes_count                      -1.0
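The -1.0 above is a sentinel value: the SEVERE line shows the totalBytes.count counter could not be found in the job's metrics (the job was cancelled before results were committed), and the reader falls back to a default. A minimal sketch of that defaulting behavior, using a hypothetical class rather than Beam's actual MetricsReader:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch (hypothetical names, not Beam's real MetricsReader API):
// a counter lookup that returns -1 when the metric is absent, which is how a
// missing totalBytes.count surfaces as dataflow_total_bytes_count = -1.0.
public class CounterLookupSketch {
    private final Map<String, Long> counters = new HashMap<>();

    void record(String name, long value) {
        counters.put(name, value);
    }

    long getCounterMetric(String name) {
        Long value = counters.get(name);
        if (value == null) {
            System.err.println("Failed to get metric " + name);
            return -1L; // sentinel: metric unavailable
        }
        return value;
    }

    public static void main(String[] args) {
        CounterLookupSketch reader = new CounterLookupSketch();
        System.out.println(reader.getCounterMetric("totalBytes.count")); // prints -1
        reader.record("totalBytes.count", 42L);
        System.out.println(reader.getCounterMetric("totalBytes.count")); // prints 42
    }
}
```

Downstream dashboards can then filter out the -1 sentinel rather than treating it as a measured byte count.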
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:137)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
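The RuntimeException above reflects a common load-test convention: any terminal job state other than DONE (here, the 3h15m timeout cancellation) is treated as a failure so the build goes red. A minimal sketch of that check, with hypothetical names rather than the actual JobFailure source:

```java
// Illustrative sketch (not the actual Beam JobFailure implementation): a
// harness that fails the run for any terminal state other than DONE, which
// is why a timed-out, cancelled job reports "Invalid job state: CANCELLED."
public class JobFailureSketch {
    enum State { DONE, FAILED, CANCELLED, UPDATED }

    static void handleFailure(State terminalState) {
        if (terminalState != State.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        handleFailure(State.DONE); // a successful job passes silently
        try {
            handleFailure(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // prints "Invalid job state: CANCELLED."
        }
    }
}
```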

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 18m 58s
90 actionable tasks: 1 executed, 89 up-to-date

Publishing build scan...
https://gradle.com/s/4phnev5kybs74

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_Streaming #555

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_Streaming/555/display/redirect?page=changes>

Changes:

[ramazan.yapparov] Renamed build.gradle to build.gradle.kts

[ramazan.yapparov] Migrated build.gradle file to Kotlin script.

[ramazan.yapparov] Updated autolabeler.yml

[noreply] [BEAM-10961] Enable strict dependency checking for

[Robert Bradshaw] [BEAM-11723] Avoid eliminate_common_key_with_none without combiner

[ningk] [BEAM-11045] Advance chrome version for screen diff integration test

[noreply] [BEAM-11731][BEAM-10582] Allow pyarrow<4,numpy<1.21.0, improve pyarrow


------------------------------------------
[...truncated 23.78 KB...]
> Task :model:pipeline:jar
> Task :model:pipeline:shadowJar FROM-CACHE
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:generateProto FROM-CACHE
> Task :model:job-management:generateProto FROM-CACHE
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:job-management:shadowJar FROM-CACHE
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:fn-execution:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar FROM-CACHE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :sdks:java:testing:load-tests:jar
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar FROM-CACHE

> Task :sdks:java:testing:load-tests:run
Feb 06, 2021 12:23:02 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Feb 06, 2021 12:23:03 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 187 files. Enable logging at DEBUG level to see which files will be staged.
Feb 06, 2021 12:23:03 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Feb 06, 2021 12:23:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Feb 06, 2021 12:23:07 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 188 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Feb 06, 2021 12:23:07 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.29.0-SNAPSHOT-A6m4bxgyKMMyKxqw0gkVdW8WGw4nJLEEWxooQpL0-zk.jar
Feb 06, 2021 12:23:08 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 188 files cached, 0 files newly uploaded in 0 seconds
Feb 06, 2021 12:23:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Feb 06, 2021 12:23:08 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b8bb184, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@472a11ae, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dc79225, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@30e9ca13, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46185a1b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@51288417, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60cf62ad, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e0895f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1ac4ccad, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fd9ebde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@14982a82, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ee5b2d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@72f8ae0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@323f3c96, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6726cc69, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b6d92e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33899f7a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7899de11, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@290d10ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1bc0d349]
Feb 06, 2021 12:23:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Feb 06, 2021 12:23:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Feb 06, 2021 12:23:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Feb 06, 2021 12:23:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Feb 06, 2021 12:23:08 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@23f3da8b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5634d0f4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@252a8aae, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3d4e405e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54e2fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70972170, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@119aa36, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e1a46fb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@69fe0ed4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@20ab3e3a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6caf7803, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@709ed6f3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@698fee9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@102c577f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d44a19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1fb2d5e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1716e8c5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6573d2f7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4052c8c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@181b8c4b]
Feb 06, 2021 12:23:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Feb 06, 2021 12:23:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Feb 06, 2021 12:23:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Feb 06, 2021 12:23:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Feb 06, 2021 12:23:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Feb 06, 2021 12:23:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Feb 06, 2021 12:23:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Feb 06, 2021 12:23:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Feb 06, 2021 12:23:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Feb 06, 2021 12:23:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Feb 06, 2021 12:23:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Feb 06, 2021 12:23:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
Feb 06, 2021 12:23:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.29.0-SNAPSHOT
Feb 06, 2021 12:23:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-06_04_23_09-7813678601342489977?project=apache-beam-testing
Feb 06, 2021 12:23:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-02-06_04_23_09-7813678601342489977
Feb 06, 2021 12:23:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-02-06_04_23_09-7813678601342489977
Feb 06, 2021 12:23:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-02-06T12:23:12.968Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java0dataflow0streaming0cogbk01-jenkins-0206122-wwl8. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:17.097Z: Worker configuration: n1-standard-4 in us-central1-f.
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:17.632Z: Expanding CoGroupByKey operations into optimizable parts.
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:17.697Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:17.717Z: Expanding GroupByKey operations into streaming Read/Write steps
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:17.774Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:17.875Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:17.905Z: Unzipping flatten s11 for input s10.org.apache.beam.sdk.values.PCollection.<init>:402#5f2ef1f005ae0b4
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:17.931Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable1
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:17.967Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable0
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:17.999Z: Fusing consumer Read input/StripIds into Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:18.035Z: Fusing consumer Read co-input/StripIds into Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:18.077Z: Fusing consumer Collect start time metrics (co-input) into Read co-input/StripIds
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:18.117Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:18.156Z: Fusing consumer CoGroupByKey/MakeUnionTable1 into Window.Into()2/Window.Assign
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:18.188Z: Fusing consumer Collect start time metrics (input) into Read input/StripIds
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:18.218Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:18.264Z: Fusing consumer CoGroupByKey/MakeUnionTable0 into Window.Into()/Window.Assign
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:18.300Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:18.401Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn into CoGroupByKey/GBK/MergeBuckets
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:18.482Z: Fusing consumer Ungroup and reiterate into CoGroupByKey/ConstructCoGbkResultFn
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:18.534Z: Fusing consumer Collect total bytes into Ungroup and reiterate
Feb 06, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:18.572Z: Fusing consumer Collect end time metrics into Collect total bytes
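The fusion messages above show Dataflow optimizing the CoGroupByKey pipeline: each input is windowed, tagged into a union table (MakeUnionTable0/1), flattened into a single GBK, and the merged result is re-expanded per key by ConstructCoGbkResultFn. As a plain-Java sketch of the per-key result that computation produces (hypothetical helper, not the Beam API; index 0 holds values from the first input, index 1 from the co-input):

```java
import java.util.*;

public class CoGbkSketch {
    // Hypothetical stand-in for CoGroupByKey semantics: for every key,
    // collect the values contributed by each of the two inputs.
    static Map<String, List<List<String>>> coGroup(
            List<String[]> input, List<String[]> coInput) {
        Map<String, List<List<String>>> result = new TreeMap<>();
        for (String[] kv : input) {
            result.computeIfAbsent(kv[0],
                k -> Arrays.asList(new ArrayList<String>(), new ArrayList<String>()))
                .get(0).add(kv[1]);
        }
        for (String[] kv : coInput) {
            result.computeIfAbsent(kv[0],
                k -> Arrays.asList(new ArrayList<String>(), new ArrayList<String>()))
                .get(1).add(kv[1]);
        }
        return result;
    }

    public static void main(String[] args) {
        Map<String, List<List<String>>> r = coGroup(
            Arrays.asList(new String[]{"k1", "a"}, new String[]{"k2", "b"}),
            Collections.singletonList(new String[]{"k1", "x"}));
        System.out.println(r); // {k1=[[a], [x]], k2=[[b], []]}
    }
}
```

Keys present in only one input still appear in the result, with an empty list on the other side, which is why CoGroupByKey behaves like a full outer join.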
Feb 06, 2021 12:23:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:19.027Z: Executing operation CoGroupByKey/GBK/ReadStream+CoGroupByKey/GBK/MergeBuckets+CoGroupByKey/ConstructCoGbkResultFn+Ungroup and reiterate+Collect total bytes+Collect end time metrics
Feb 06, 2021 12:23:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:19.082Z: Executing operation Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read co-input/StripIds+Collect start time metrics (co-input)+Window.Into()2/Window.Assign+CoGroupByKey/MakeUnionTable1+CoGroupByKey/GBK/WriteStream
Feb 06, 2021 12:23:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:19.141Z: Executing operation Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read input/StripIds+Collect start time metrics (input)+Window.Into()/Window.Assign+CoGroupByKey/MakeUnionTable0+CoGroupByKey/GBK/WriteStream
Feb 06, 2021 12:23:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:19.150Z: Starting 5 workers in us-central1-f...
Feb 06, 2021 12:23:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:23:42.985Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Feb 06, 2021 12:24:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:24:05.252Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 06, 2021 12:24:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:24:31.408Z: Workers have started successfully.
Feb 06, 2021 12:24:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T12:24:31.456Z: Workers have started successfully.
Feb 06, 2021 4:01:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T16:01:00.446Z: Cancel request is committed for workflow job: 2021-02-06_04_23_09-7813678601342489977.
Feb 06, 2021 4:01:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T16:01:00.474Z: Finished operation CoGroupByKey/GBK/ReadStream+CoGroupByKey/GBK/MergeBuckets+CoGroupByKey/ConstructCoGbkResultFn+Ungroup and reiterate+Collect total bytes+Collect end time metrics
Feb 06, 2021 4:01:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T16:01:00.475Z: Finished operation Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read input/StripIds+Collect start time metrics (input)+Window.Into()/Window.Assign+CoGroupByKey/MakeUnionTable0+CoGroupByKey/GBK/WriteStream
Feb 06, 2021 4:01:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T16:01:00.475Z: Finished operation Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read co-input/StripIds+Collect start time metrics (co-input)+Window.Into()2/Window.Assign+CoGroupByKey/MakeUnionTable1+CoGroupByKey/GBK/WriteStream
Feb 06, 2021 4:01:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T16:01:00.851Z: Cleaning up.
Feb 06, 2021 4:01:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T16:01:00.947Z: Stopping worker pool...
Feb 06, 2021 4:01:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T16:01:51.534Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 06, 2021 4:01:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-06T16:01:51.596Z: Worker pool stopped.
Feb 06, 2021 4:01:59 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-02-06_04_23_09-7813678601342489977 finished with status CANCELLED.
Feb 06, 2021 4:01:59 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace co_gbk
Load test results for test (ID): af1d8989-f282-40fb-b67e-5db146c55444 and timestamp: 2021-02-06T12:23:03.374000000Z:
                    Metric:      Value:
      dataflow_runtime_sec   12803.003
dataflow_total_bytes_count        -1.0
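The `-1.0` for `dataflow_total_bytes_count` follows from the SEVERE line above: when `MetricsReader.getCounterMetric` cannot find the counter (here, because the job was cancelled mid-run), the result falls back to a sentinel. A minimal sketch of that fallback pattern, assuming a hypothetical map-backed reader rather than Beam's actual `MetricsReader`:

```java
import java.util.*;

public class CounterReaderSketch {
    // Hypothetical metrics backend: counter name -> value.
    private final Map<String, Long> counters;

    CounterReaderSketch(Map<String, Long> counters) {
        this.counters = counters;
    }

    // Return the counter if present; otherwise log the failure and
    // return -1, the sentinel seen in the results table above.
    long getCounterMetric(String name) {
        Long value = counters.get(name);
        if (value == null) {
            System.err.println("Failed to get metric " + name);
            return -1L;
        }
        return value;
    }

    public static void main(String[] args) {
        CounterReaderSketch reader =
            new CounterReaderSketch(Collections.singletonMap("runtime", 12803L));
        System.out.println(reader.getCounterMetric("runtime"));          // 12803
        System.out.println(reader.getCounterMetric("totalBytes.count")); // -1
    }
}
```

A sentinel keeps the results table publishable even when one metric is missing, at the cost of downstream dashboards needing to treat -1 as "absent" rather than a real byte count.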
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:137)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
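The `RuntimeException` above is the load test's failure handling at work: the streaming job was cancelled by the scheduled timeout, and `JobFailure.handleFailure` treats any terminal state other than DONE as a test failure, which is what fails the Gradle task below. A loose sketch of that check (hypothetical enum and method, not the Beam source):

```java
public class JobFailureSketch {
    enum State { DONE, FAILED, CANCELLED, UPDATED }

    // Treat any terminal state other than DONE as a load-test failure,
    // mirroring the behavior seen in the stack trace above.
    static void handleFailure(State terminalState) {
        if (terminalState != State.DONE) {
            throw new RuntimeException("Invalid job state: " + terminalState + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // Invalid job state: CANCELLED.
        }
    }
}
```

Failing loudly on CANCELLED is deliberate for a load test: a job that had to be cancelled never produced a clean end-to-end measurement, so marking the build red is more honest than reporting partial metrics as a pass.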

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 39m 41s
90 actionable tasks: 55 executed, 35 from cache

Publishing build scan...
https://gradle.com/s/pbg7c62ygiira

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure