Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/09/10 12:28:44 UTC

Build failed in Jenkins: beam_LoadTests_Java_Combine_Dataflow_Streaming #852

See <https://ci-beam.apache.org/job/beam_LoadTests_Java_Combine_Dataflow_Streaming/852/display/redirect?page=changes>

Changes:

[ruwan.lambrichts] Clarify additional_bq_parameters argument

[kawaigin] [BEAM-10708] Support streaming cache in beam_sql magic

[noreply] Fix broken 'differences from pandas' link

[noreply] Added GroupBy row in Aggregation table.

[Luke Cwik] [BEAM-12769] Fix typo in test class name, CLass -> Class

[Etienne Chauchot] [BEAM-5172] Temporary ignore testSplit and testSizes tests waiting for a

[samuelw] [BEAM-12740] Remove matching to filter files when renaming gcs files in

[noreply] [BEAM-3304] Helper functions for triggers (#15430)

[esert] Bump a throttling counter on BigQueryRead retries due to

[noreply] [BEAM-5097] Increment counter for "small words" in go SDK example

[noreply] Register MapCoder, some comments/cleanup. (#15471)

[noreply] [BEAM-12588] Multimap user state proto changes (#15473)


------------------------------------------
[...truncated 7.96 KB...]
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :model:job-management:createCheckerFrameworkManifest
> Task :sdks:java:core:createCheckerFrameworkManifest
> Task :sdks:java:fn-execution:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:createCheckerFrameworkManifest
> Task :sdks:java:extensions:google-cloud-platform-core:createCheckerFrameworkManifest
> Task :sdks:java:expansion-service:createCheckerFrameworkManifest
> Task :model:pipeline:createCheckerFrameworkManifest
> Task :model:fn-execution:createCheckerFrameworkManifest
> Task :runners:java-fn-execution:createCheckerFrameworkManifest
> Task :runners:core-java:createCheckerFrameworkManifest
> Task :sdks:java:harness:createCheckerFrameworkManifest
> Task :runners:core-construction-java:createCheckerFrameworkManifest
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:extensions:arrow:createCheckerFrameworkManifest
> Task :sdks:java:extensions:protobuf:createCheckerFrameworkManifest
> Task :sdks:java:extensions:arrow:processResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:io:synthetic:createCheckerFrameworkManifest
> Task :sdks:java:io:kinesis:createCheckerFrameworkManifest
> Task :sdks:java:io:google-cloud-platform:createCheckerFrameworkManifest
> Task :sdks:java:io:kafka:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:worker:windmill:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :sdks:java:testing:load-tests:createCheckerFrameworkManifest
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:createCheckerFrameworkManifest
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:core:processResources
> Task :runners:google-cloud-dataflow-java:processResources
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractIncludeProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractProto
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:generateProto FROM-CACHE
> Task :model:pipeline:generateProto FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:processResources
> Task :runners:google-cloud-dataflow-java:worker:windmill:classes
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :runners:google-cloud-dataflow-java:worker:windmill:shadowJar FROM-CACHE
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:generateProto FROM-CACHE
> Task :model:job-management:generateProto FROM-CACHE
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar FROM-CACHE
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :model:job-management:shadowJar FROM-CACHE
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:extensions:arrow:compileJava FROM-CACHE
> Task :sdks:java:extensions:arrow:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:extensions:arrow:jar
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:io:kinesis:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar

> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar
> Task :runners:google-cloud-dataflow-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar

> Task :sdks:java:testing:load-tests:run
Sep 10, 2021 12:28:19 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
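
The warning above refers to the pipeline option that replaced the deprecated --workerHarnessContainerImage. A minimal sketch of passing the newer --sdkContainerImage option to this load test, assuming Beam's -PloadTest.mainClass / -PloadTest.args Gradle properties; the image URI is a placeholder and the other options the test actually needs are omitted:

    # Sketch only: image URI and option list are placeholders, not the values this job used
    ./gradlew :sdks:java:testing:load-tests:run \
        -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest \
        -PloadTest.args="--runner=DataflowRunner --project=apache-beam-testing \
            --region=us-central1 --sdkContainerImage=gcr.io/example-project/beam_java8_sdk:latest"
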
Sep 10, 2021 12:28:19 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 10, 2021 12:28:20 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 10, 2021 12:28:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 10, 2021 12:28:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 196 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 10, 2021 12:28:23 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-h_pN1a12Kq6Q7UgG0NwuMuy2KLX9eKEV-XpQQIHwHqY.jar
Sep 10, 2021 12:28:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 196 files cached, 0 files newly uploaded in 0 seconds
Sep 10, 2021 12:28:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 10, 2021 12:28:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <94277 bytes, hash 6fb47399af259ea86e36f97024f11d15dbc8610cc963f9df73a51ae4f9d10845> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-b7Rzma8lnqhuNvlwJPEdFdvIYQzJY_nfc6Ua5PnRCEU.pb
Sep 10, 2021 12:28:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 10, 2021 12:28:25 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d6bbd35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c5d6175, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7544ac86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b27b497, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b1534d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c74aa0d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c841199, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a818392, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@489091bd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@512d6e60, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1de9b505, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b122839, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3743539f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d277579, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5db6b845, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@378f002a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1afd72ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2cc75074, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@445bb139, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b9a77c8]
Sep 10, 2021 12:28:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 10, 2021 12:28:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metric as step s3
Sep 10, 2021 12:28:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect metrics as step s4
Sep 10, 2021 12:28:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s5
Sep 10, 2021 12:28:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Convert to Long: 0/Map as step s6
Sep 10, 2021 12:28:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Combine: 0/GroupByKey as step s7
Sep 10, 2021 12:28:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Combine: 0/Combine.GroupedValues as step s8
Sep 10, 2021 12:28:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metric as step s9
Sep 10, 2021 12:28:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 10, 2021 12:28:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-10_05_28_25-18367361361567395324?project=apache-beam-testing
Sep 10, 2021 12:28:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-10_05_28_25-18367361361567395324
Sep 10, 2021 12:28:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-10_05_28_25-18367361361567395324
Sep 10, 2021 12:28:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-10T12:28:30.954Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java0dataflow0streaming0combine01-jenkins-09101-xlk8. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 10, 2021 12:28:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-10T12:28:35.935Z: Worker configuration: e2-standard-4 in us-central1-a.
Sep 10, 2021 12:28:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-10T12:28:36.649Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 5 instances in region us-central1. Quota summary (required/available): 5/24314 instances, 20/14 CPUs, 2150/217676 disk GB, 0/2397 SSD disk GB, 1/186 instance groups, 1/191 managed instance groups, 1/411 instance templates, 5/543 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
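
The quota summary above reads required/available: the job asks for 5 e2-standard-4 workers, i.e. 5 x 4 = 20 vCPUs, but only 14 CPUs were free in us-central1 at submission time, so the workflow fails before any worker starts. One way to inspect the regional CPU quota, sketched with gcloud using the project and region from this log (the grep just narrows the YAML output to the CPUS entry):

    gcloud compute regions describe us-central1 --project=apache-beam-testing \
        --format=yaml | grep -B1 -A1 'metric: CPUS$'
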
Sep 10, 2021 12:28:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-10T12:28:36.679Z: Cleaning up.
Sep 10, 2021 12:28:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-10T12:28:36.736Z: Worker pool stopped.
Sep 10, 2021 12:28:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-10T12:28:37.954Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
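
The message above means the project already holds 100 Dataflow-created metric descriptors, so new custom.googleapis.com/* counters from this job are skipped; the counters remain readable via dataflow.googleapis.com/job/user_counter. A hedged sketch of cleaning up stale descriptors through the Monitoring v3 REST API linked in the message, using curl with a gcloud access token; METRIC_TYPE is a placeholder for a descriptor name returned by the list call:

    # List custom metric descriptors in the project
    curl -s -G -H "Authorization: Bearer $(gcloud auth print-access-token)" \
        --data-urlencode 'filter=metric.type = starts_with("custom.googleapis.com/")' \
        "https://monitoring.googleapis.com/v3/projects/apache-beam-testing/metricDescriptors"

    # Delete one unused descriptor, e.g. METRIC_TYPE=custom.googleapis.com/some_old_counter
    curl -s -X DELETE -H "Authorization: Bearer $(gcloud auth print-access-token)" \
        "https://monitoring.googleapis.com/v3/projects/apache-beam-testing/metricDescriptors/METRIC_TYPE"
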
Sep 10, 2021 12:28:41 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-10_05_28_25-18367361361567395324 failed with status FAILED.
Sep 10, 2021 12:28:41 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace combine
Load test results for test (ID): 897ba1b1-d5a0-4c76-85e9-0739604ab884 and timestamp: 2021-09-10T12:28:20.645000000Z:
                 Metric:                    Value:
    dataflow_runtime_sec                       0.0
dataflow_total_bytes_count                      -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid job state: FAILED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:172)

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 43s
90 actionable tasks: 60 executed, 30 from cache

Publishing build scan...
https://gradle.com/s/mworosvtvzhxk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_LoadTests_Java_Combine_Dataflow_Streaming #856

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_Combine_Dataflow_Streaming/856/display/redirect?page=changes>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_Combine_Dataflow_Streaming #855

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_Combine_Dataflow_Streaming/855/display/redirect>

Changes:


------------------------------------------
[...truncated 4.77 KB...]
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:core:createCheckerFrameworkManifest
> Task :runners:java-fn-execution:createCheckerFrameworkManifest
> Task :sdks:java:fn-execution:createCheckerFrameworkManifest
> Task :runners:core-java:createCheckerFrameworkManifest
> Task :sdks:java:extensions:google-cloud-platform-core:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:createCheckerFrameworkManifest
> Task :model:pipeline:createCheckerFrameworkManifest
> Task :model:job-management:createCheckerFrameworkManifest
> Task :runners:core-construction-java:createCheckerFrameworkManifest
> Task :sdks:java:expansion-service:createCheckerFrameworkManifest
> Task :sdks:java:harness:createCheckerFrameworkManifest
> Task :model:fn-execution:createCheckerFrameworkManifest
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:extensions:arrow:createCheckerFrameworkManifest
> Task :sdks:java:extensions:protobuf:createCheckerFrameworkManifest
> Task :sdks:java:extensions:arrow:processResources NO-SOURCE
> Task :model:job-management:extractProto
> Task :model:fn-execution:extractProto
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:io:google-cloud-platform:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:createCheckerFrameworkManifest
> Task :sdks:java:io:synthetic:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:worker:windmill:createCheckerFrameworkManifest
> Task :sdks:java:io:kinesis:createCheckerFrameworkManifest
> Task :sdks:java:io:kafka:createCheckerFrameworkManifest
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:createCheckerFrameworkManifest
> Task :sdks:java:testing:load-tests:createCheckerFrameworkManifest
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :runners:google-cloud-dataflow-java:processResources
> Task :sdks:java:core:processResources
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractIncludeProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:generateProto FROM-CACHE
> Task :model:pipeline:generateProto FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:processResources
> Task :runners:google-cloud-dataflow-java:worker:windmill:classes
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :runners:google-cloud-dataflow-java:worker:windmill:shadowJar FROM-CACHE
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto FROM-CACHE
> Task :model:fn-execution:generateProto FROM-CACHE
> Task :model:pipeline:shadowJar FROM-CACHE
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:job-management:shadowJar FROM-CACHE
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:extensions:arrow:compileJava FROM-CACHE
> Task :sdks:java:extensions:arrow:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:extensions:arrow:jar
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:io:kinesis:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar FROM-CACHE

> Task :sdks:java:testing:load-tests:run
Sep 13, 2021 12:10:39 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
Sep 13, 2021 12:10:40 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 13, 2021 12:10:40 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 13, 2021 12:10:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 13, 2021 12:10:45 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 196 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 13, 2021 12:10:45 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
Sep 13, 2021 12:10:46 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 196 files cached, 0 files newly uploaded in 0 seconds
Sep 13, 2021 12:10:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 13, 2021 12:10:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <94271 bytes, hash ab745d38fd08ff0debeeb6e75de1caba977c3810f5cba73e62893ac1c43c4e09> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-q3RdOP0I_w3r7rbnXeHKupd8OBD1y6c-Yok6wcQ8Tgk.pb
Sep 13, 2021 12:10:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 13, 2021 12:10:48 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43f1bb92, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d6bbd35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c5d6175, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7544ac86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b27b497, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b1534d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c74aa0d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c841199, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a818392, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@489091bd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@512d6e60, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1de9b505, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b122839, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3743539f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d277579, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5db6b845, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@378f002a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1afd72ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2cc75074, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@445bb139]
Sep 13, 2021 12:10:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 13, 2021 12:10:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metric as step s3
Sep 13, 2021 12:10:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect metrics as step s4
Sep 13, 2021 12:10:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s5
Sep 13, 2021 12:10:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Convert to Long: 0/Map as step s6
Sep 13, 2021 12:10:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Combine: 0/GroupByKey as step s7
Sep 13, 2021 12:10:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Combine: 0/Combine.GroupedValues as step s8
Sep 13, 2021 12:10:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metric as step s9
Sep 13, 2021 12:10:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 13, 2021 12:10:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-13_05_10_48-9869641847693921516?project=apache-beam-testing
Sep 13, 2021 12:10:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-13_05_10_48-9869641847693921516
Sep 13, 2021 12:10:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-13_05_10_48-9869641847693921516
Sep 13, 2021 12:10:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-13T12:10:57.137Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java0dataflow0streaming0combine01-jenkins-09131-rhkj. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 13, 2021 12:11:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:11:03.551Z: Worker configuration: e2-standard-4 in us-central1-a.
Sep 13, 2021 12:11:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-13T12:11:04.168Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 5 instances in region us-central1. Quota summary (required/available): 5/24386 instances, 20/0 CPUs, 2150/183716 disk GB, 0/2397 SSD disk GB, 1/288 instance groups, 1/291 managed instance groups, 1/517 instance templates, 5/615 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
Sep 13, 2021 12:11:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:11:04.203Z: Cleaning up.
Sep 13, 2021 12:11:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:11:04.254Z: Worker pool stopped.
Sep 13, 2021 12:11:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:11:05.450Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 13, 2021 12:11:10 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-13_05_10_48-9869641847693921516 failed with status FAILED.
Sep 13, 2021 12:11:10 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace combine
Load test results for test (ID): 03b2c83f-da5a-4d92-b805-b8d4608d78e6 and timestamp: 2021-09-13T12:10:41.398000000Z:
                 Metric:                    Value:
    dataflow_runtime_sec                       0.0
dataflow_total_bytes_count                      -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid job state: FAILED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:172)

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 53s
90 actionable tasks: 56 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/hq4mahxj2vglk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_Combine_Dataflow_Streaming #854

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_Combine_Dataflow_Streaming/854/display/redirect>

Changes:


------------------------------------------
[...truncated 4.77 KB...]
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:expansion-service:createCheckerFrameworkManifest
> Task :runners:core-construction-java:createCheckerFrameworkManifest
> Task :sdks:java:extensions:google-cloud-platform-core:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:createCheckerFrameworkManifest
> Task :model:pipeline:createCheckerFrameworkManifest
> Task :runners:java-fn-execution:createCheckerFrameworkManifest
> Task :sdks:java:core:createCheckerFrameworkManifest
> Task :sdks:java:fn-execution:createCheckerFrameworkManifest
> Task :model:job-management:createCheckerFrameworkManifest
> Task :runners:core-java:createCheckerFrameworkManifest
> Task :model:fn-execution:createCheckerFrameworkManifest
> Task :sdks:java:harness:createCheckerFrameworkManifest
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:extensions:arrow:createCheckerFrameworkManifest
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:createCheckerFrameworkManifest
> Task :sdks:java:extensions:arrow:processResources NO-SOURCE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:processResources NO-SOURCE
> Task :model:job-management:extractProto
> Task :sdks:java:testing:load-tests:createCheckerFrameworkManifest
> Task :sdks:java:io:kinesis:createCheckerFrameworkManifest
> Task :sdks:java:extensions:protobuf:extractProto
> Task :model:fn-execution:extractProto
> Task :sdks:java:io:kafka:createCheckerFrameworkManifest
> Task :sdks:java:io:synthetic:createCheckerFrameworkManifest
> Task :sdks:java:io:google-cloud-platform:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:worker:windmill:createCheckerFrameworkManifest
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:createCheckerFrameworkManifest
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :runners:google-cloud-dataflow-java:processResources
> Task :sdks:java:core:processResources
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractIncludeProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractProto
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:generateProto FROM-CACHE
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :runners:google-cloud-dataflow-java:worker:windmill:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:processResources
> Task :runners:google-cloud-dataflow-java:worker:windmill:classes
> Task :model:pipeline:jar
> Task :runners:google-cloud-dataflow-java:worker:windmill:shadowJar FROM-CACHE
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto FROM-CACHE
> Task :model:fn-execution:extractIncludeProto
> Task :model:fn-execution:generateProto FROM-CACHE
> Task :model:pipeline:shadowJar FROM-CACHE
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:job-management:shadowJar FROM-CACHE
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:extensions:arrow:compileJava FROM-CACHE
> Task :sdks:java:extensions:arrow:classes UP-TO-DATE
> Task :sdks:java:extensions:arrow:jar
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:fn-execution:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar FROM-CACHE

> Task :sdks:java:testing:load-tests:run
Sep 12, 2021 12:08:29 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
Sep 12, 2021 12:08:31 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 12, 2021 12:08:32 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 12, 2021 12:08:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 12, 2021 12:08:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 196 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 12, 2021 12:08:36 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
Sep 12, 2021 12:08:38 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 196 files cached, 0 files newly uploaded in 2 seconds
Sep 12, 2021 12:08:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 12, 2021 12:08:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <94271 bytes, hash 4063bab0c6c9f389ad3d07d8e47c20eb0a0a130c31017b2578cbebf152b9b7c0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-QGO6sMbJ84mtPQfY5Hwg6woKEwwxAXsleMvr8VK5t8A.pb
Sep 12, 2021 12:08:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 12, 2021 12:08:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d6bbd35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c5d6175, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7544ac86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b27b497, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b1534d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c74aa0d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c841199, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a818392, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@489091bd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@512d6e60, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1de9b505, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b122839, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3743539f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d277579, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5db6b845, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@378f002a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1afd72ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2cc75074, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@445bb139, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b9a77c8]
Sep 12, 2021 12:08:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 12, 2021 12:08:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metric as step s3
Sep 12, 2021 12:08:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect metrics as step s4
Sep 12, 2021 12:08:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s5
Sep 12, 2021 12:08:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Convert to Long: 0/Map as step s6
Sep 12, 2021 12:08:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Combine: 0/GroupByKey as step s7
Sep 12, 2021 12:08:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Combine: 0/Combine.GroupedValues as step s8
Sep 12, 2021 12:08:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metric as step s9
Sep 12, 2021 12:08:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 12, 2021 12:08:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_05_08_42-7471290028464055433?project=apache-beam-testing
Sep 12, 2021 12:08:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-12_05_08_42-7471290028464055433
Sep 12, 2021 12:08:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-12_05_08_42-7471290028464055433
Sep 12, 2021 12:08:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-12T12:08:47.127Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java0dataflow0streaming0combine01-jenkins-09121-6t9d. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 12, 2021 12:08:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:08:52.234Z: Worker configuration: e2-standard-4 in us-central1-a.
Sep 12, 2021 12:08:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-12T12:08:52.977Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 5 instances in region us-central1. Quota summary (required/available): 5/24378 instances, 20/0 CPUs, 2150/186841 disk GB, 0/2397 SSD disk GB, 1/287 instance groups, 1/290 managed instance groups, 1/516 instance templates, 5/607 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
Sep 12, 2021 12:08:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:08:53.014Z: Cleaning up.
Sep 12, 2021 12:08:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:08:53.066Z: Worker pool stopped.
Sep 12, 2021 12:08:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:08:54.289Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 12, 2021 12:08:57 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-12_05_08_42-7471290028464055433 failed with status FAILED.
Sep 12, 2021 12:08:57 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace combine
Load test results for test (ID): b51b73e3-16e7-4e3a-a92c-990fd4ea6ddc and timestamp: 2021-09-12T12:08:32.508000000Z:
                 Metric:                    Value:
    dataflow_runtime_sec                       0.0
dataflow_total_bytes_count                      -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid job state: FAILED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:172)

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 22s
90 actionable tasks: 56 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/igrpszzmbdjpu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_Combine_Dataflow_Streaming #853

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_Combine_Dataflow_Streaming/853/display/redirect?page=changes>

Changes:

[noreply] Added type annotations to some combiners missing it. (#15414)

[noreply] [BEAM-12634] JmsIO auto scaling feature (#15464)

[noreply] [BEAM-12662] Get Flink version from cluster. (#15223)

[noreply] Port changes from Pub/Sub Lite to beam (#15418)

[heejong] [BEAM-12805] Fix XLang CombinePerKey test by explicitly assigning the

[BenWhitehead] [BEAM-8376] Google Cloud Firestore Connector - Add handling for

[noreply] Decreasing peak memory usage for beam.TupleCombineFn (#15494)

[noreply] [BEAM-12802] Add support for prefetch through data layers down through

[noreply] [BEAM-11097] Add implementation of side input cache (#15483)


------------------------------------------
[...truncated 6.57 KB...]
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :runners:java-fn-execution:createCheckerFrameworkManifest
> Task :runners:core-construction-java:createCheckerFrameworkManifest
> Task :sdks:java:expansion-service:createCheckerFrameworkManifest
> Task :sdks:java:fn-execution:createCheckerFrameworkManifest
> Task :model:pipeline:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:createCheckerFrameworkManifest
> Task :sdks:java:extensions:google-cloud-platform-core:createCheckerFrameworkManifest
> Task :runners:core-java:createCheckerFrameworkManifest
> Task :model:fn-execution:createCheckerFrameworkManifest
> Task :sdks:java:core:createCheckerFrameworkManifest
> Task :model:job-management:createCheckerFrameworkManifest
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:createCheckerFrameworkManifest
> Task :sdks:java:extensions:arrow:createCheckerFrameworkManifest
> Task :sdks:java:extensions:protobuf:extractProto
> Task :model:job-management:extractProto
> Task :model:fn-execution:extractProto
> Task :sdks:java:harness:createCheckerFrameworkManifest
> Task :sdks:java:io:kafka:createCheckerFrameworkManifest
> Task :sdks:java:extensions:arrow:processResources NO-SOURCE
> Task :runners:google-cloud-dataflow-java:****:windmill:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:****:legacy-****:createCheckerFrameworkManifest
> Task :sdks:java:io:google-cloud-platform:createCheckerFrameworkManifest
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:****:legacy-****:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:createCheckerFrameworkManifest
> Task :sdks:java:io:synthetic:createCheckerFrameworkManifest
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:createCheckerFrameworkManifest
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :model:fn-execution:processResources
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :runners:google-cloud-dataflow-java:processResources
> Task :sdks:java:core:processResources
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :runners:google-cloud-dataflow-java:****:windmill:extractIncludeProto
> Task :runners:google-cloud-dataflow-java:****:windmill:extractProto
> Task :model:pipeline:generateProto FROM-CACHE
> Task :runners:google-cloud-dataflow-java:****:windmill:generateProto FROM-CACHE
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :runners:google-cloud-dataflow-java:****:windmill:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:****:windmill:processResources
> Task :runners:google-cloud-dataflow-java:****:windmill:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto FROM-CACHE
> Task :model:fn-execution:extractIncludeProto
> Task :model:fn-execution:generateProto FROM-CACHE
> Task :runners:google-cloud-dataflow-java:****:windmill:shadowJar FROM-CACHE
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar FROM-CACHE
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :model:job-management:shadowJar FROM-CACHE
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:extensions:arrow:compileJava FROM-CACHE
> Task :sdks:java:extensions:arrow:classes UP-TO-DATE
> Task :sdks:java:extensions:arrow:jar
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:****:legacy-****:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:****:legacy-****:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar FROM-CACHE

> Task :sdks:java:io:synthetic:compileJava
Note: <https://ci-beam.apache.org/job/beam_LoadTests_Java_Combine_Dataflow_Streaming/ws/src/sdks/java/io/synthetic/src/main/java/org/apache/beam/sdk/io/synthetic/SyntheticBoundedSource.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :sdks:java:io:synthetic:classes
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:io:kinesis:compileJava
> Task :sdks:java:io:kinesis:classes
> Task :sdks:java:io:kinesis:jar

> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
Sep 11, 2021 12:15:57 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
WARNING: Prefer --sdkContainerImage over deprecated legacy option --****HarnessContainerImage.
Sep 11, 2021 12:15:58 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
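
(Both the deprecated-option warning and the stagingLocation fallback above can be avoided by setting the options explicitly. A minimal sketch, assuming DataflowPipelineOptions exposes these setters; bucket paths and the image name are placeholders:)

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    class OptionsSetup {
      static DataflowPipelineOptions configure(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // Name a staging bucket explicitly instead of falling back to gcpTempLocation.
        options.setGcpTempLocation("gs://my-bucket/temp");       // placeholder path
        options.setStagingLocation("gs://my-bucket/staging");    // placeholder path
        // Prefer the non-deprecated container image option.
        options.setSdkContainerImage("gcr.io/my-project/my-sdk-image");  // placeholder image
        return options;
      }
    }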
Sep 11, 2021 12:15:58 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 11, 2021 12:15:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 11, 2021 12:16:03 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 196 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 11, 2021 12:16:03 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
INFO: Staging custom dataflow-****.jar as beam-runners-google-cloud-dataflow-java-legacy-****-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
Sep 11, 2021 12:16:04 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 196 files cached, 0 files newly uploaded in 0 seconds
Sep 11, 2021 12:16:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 11, 2021 12:16:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <94271 bytes, hash 234aa206475931f053251d207da0d70ac8da1123cda124d1635c16e27b54da63> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-I0qiBkdZMfBTJR0gfaDXCsjaESPNoSTRY1wW4ntU2mM.pb
Sep 11, 2021 12:16:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 11, 2021 12:16:06 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43f1bb92, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6d6bbd35, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5c5d6175, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7544ac86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b27b497, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b1534d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c74aa0d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c841199, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a818392, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@489091bd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@512d6e60, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1de9b505, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b122839, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3743539f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d277579, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5db6b845, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@378f002a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1afd72ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2cc75074, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@445bb139]
Sep 11, 2021 12:16:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 11, 2021 12:16:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metric as step s3
Sep 11, 2021 12:16:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect metrics as step s4
Sep 11, 2021 12:16:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s5
Sep 11, 2021 12:16:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Convert to Long: 0/Map as step s6
Sep 11, 2021 12:16:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Combine: 0/GroupByKey as step s7
Sep 11, 2021 12:16:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Combine: 0/Combine.GroupedValues as step s8
Sep 11, 2021 12:16:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metric as step s9
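
(For reference, steps s1-s9 above amount to a streaming read, windowing, and a per-key combine. A minimal sketch of that shape using stock transforms, where GenerateSequence stands in for the SyntheticUnboundedSource and the rate, window size and key fan-out are arbitrary:)

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.transforms.WithKeys;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.TypeDescriptors;
    import org.joda.time.Duration;

    public class StreamingCombineSketch {
      public static void main(String[] args) {
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        Pipeline p = Pipeline.create(options);
        p.apply("Read input",
                GenerateSequence.from(0).withRate(1000, Duration.standardSeconds(1)))
         .apply("Window.Into()",
                Window.<Long>into(FixedWindows.of(Duration.standardSeconds(60))))
         .apply("Key", WithKeys.of((Long x) -> x % 10).withKeyType(TypeDescriptors.longs()))
         .apply("Combine", Combine.perKey(Sum.ofLongs()));
        p.run();
      }
    }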
Sep 11, 2021 12:16:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 11, 2021 12:16:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-11_05_16_06-13518322632204788697?project=apache-beam-testing
Sep 11, 2021 12:16:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-11_05_16_06-13518322632204788697
Sep 11, 2021 12:16:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-11_05_16_06-13518322632204788697
Sep 11, 2021 12:16:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-11T12:16:11.980Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java0dataflow0streaming0combine01-jenkins-09111-6gtu. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 11, 2021 12:16:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-11T12:16:17.825Z: Worker configuration: e2-standard-4 in us-central1-a.
Sep 11, 2021 12:16:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-11T12:16:18.460Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 5 instances in region us-central1. Quota summary (required/available): 5/24360 instances, 20/2 CPUs, 2150/187396 disk GB, 0/2397 SSD disk GB, 1/239 instance groups, 1/242 managed instance groups, 1/468 instance templates, 5/589 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
Sep 11, 2021 12:16:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-11T12:16:18.489Z: Cleaning up.
Sep 11, 2021 12:16:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-11T12:16:18.540Z: Worker pool stopped.
Sep 11, 2021 12:16:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-11T12:16:19.695Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 11, 2021 12:16:24 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-11_05_16_06-13518322632204788697 failed with status FAILED.
Sep 11, 2021 12:16:25 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace combine
Load test results for test (ID): 5aec5624-83cb-45a3-982e-566a68dd53d2 and timestamp: 2021-09-11T12:15:59.219000000Z:
                 Metric:                    Value:
    dataflow_runtime_sec                       0.0
dataflow_total_bytes_count                      -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid job state: FAILED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:172)

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 36s
90 actionable tasks: 59 executed, 31 from cache

Publishing build scan...
https://gradle.com/s/iwthhloge63f2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org