Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/09/09 16:01:05 UTC

Build failed in Jenkins: beam_LoadTests_Java_ParDo_Dataflow_Streaming #844

See <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_Streaming/844/display/redirect?page=changes>

Changes:

[suztomo] [BEAM-11205] Upgrading the Libraries BOM to v22

[randomstep] [BEAM-12708] Bump arrow-memory-netty

[Etienne Chauchot] [BEAM-12153] implement GroupByKey with CombinePerKey with Concatenate

[Etienne Chauchot] [BEAM-11023] Increase memory in SS Validates runner tests to avoid OOM

[vincent.marquez] [BEAM-9008] adds CassandraIO.readAll

[Etienne Chauchot] [BEAM-12727] extract Concatenate CombineFn to runner-core module to

[noreply] Add display data for JdbcIO.write (#15460)

[noreply] Merge pull request #15480: [BEAM-12356] Make sure DatasetService is

[noreply] [BEAM-11981] Java Bigtable - Implement IO Request Count metrics (#15342)

[noreply] [BEAM-12834] Improve Go SDK cross-language documentation and API.


------------------------------------------
[...truncated 502.06 KB...]
SEVERE: 2021-09-09T15:50:48.744Z: Workflow failed.
Sep 09, 2021 3:51:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-09T15:51:01.719Z: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 5 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  Limit: 2000.0 in region us-central1.
Sep 09, 2021 3:51:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-09T15:51:01.748Z: Workflow failed.
[...the same "Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 5 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  Limit: 2000.0 in region us-central1." / "Workflow failed." pair repeated roughly every 13 seconds until 2021-09-09T15:59:49...]
Sep 09, 2021 3:59:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-09T15:59:54.828Z: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 5 instances in region us-central1. Quota summary (required/available): 5/19493 instances, 20/12 CPUs, 2150/255161 disk GB, 0/2397 SSD disk GB, 1/275 instance groups, 1/278 managed instance groups, 1/504 instance templates, 5/722 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
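
For reference, the regional quota the job is hitting can be inspected with the gcloud CLI (project and region taken from the messages above; the output format may vary by gcloud version):

> gcloud compute regions describe us-central1 --project=apache-beam-testing

The quotas section of that output lists the CPUS metric with its current usage and limit, which can be compared against the 20 CPUs the quota summary above says this workflow requires (5 workers x 4 vCPUs).
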
Sep 09, 2021 4:00:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-09T16:00:01.868Z: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 5 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  Limit: 2000.0 in region us-central1.
Sep 09, 2021 4:00:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-09T16:00:01.897Z: Workflow failed.
Sep 09, 2021 4:00:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-09T16:00:15.492Z: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 5 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  Limit: 2000.0 in region us-central1.
Sep 09, 2021 4:00:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-09T16:00:15.515Z: Workflow failed.
Sep 09, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-09T16:00:28.499Z: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 5 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  Limit: 2000.0 in region us-central1.
Sep 09, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-09T16:00:28.531Z: Workflow failed.
Sep 09, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-09T16:00:32.403Z: Cancel request is committed for workflow job: 2021-09-09_05_14_37-14371174614367779468.
Sep 09, 2021 4:00:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-09T16:00:32.428Z: Finished operation Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read input/StripIds+ParDo(TimeMonitor)+ParDo(ByteMonitor)+Step: 0+Step: 1+Step: 2+Step: 3+Step: 4+Step: 5+Step: 6+Step: 7+Step: 8+Step: 9+ParDo(TimeMonitor)2
Sep 09, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-09T16:00:34.353Z: Cleaning up.
Sep 09, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-09T16:00:34.447Z: Stopping worker pool...
Sep 09, 2021 4:00:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-09T16:00:55.695Z: Worker pool stopped.
Sep 09, 2021 4:01:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-09_05_14_37-14371174614367779468 finished with status CANCELLED.
Sep 09, 2021 4:01:03 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace pardo
Load test results for test (ID): 2a1d093d-db6b-4bf7-94d4-17fe50e68fb7 and timestamp: 2021-09-09T12:14:32.468000000Z:
                 Metric:                    Value:
    dataflow_runtime_sec                       0.0
dataflow_total_bytes_count                      -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.ParDoLoadTest.run(ParDoLoadTest.java:53)
	at org.apache.beam.sdk.loadtests.ParDoLoadTest.main(ParDoLoadTest.java:103)
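
The RuntimeException above comes from the load-test harness rejecting the CANCELLED terminal state. A minimal sketch of that kind of check against the public Beam API (illustrative only, not the actual org.apache.beam.sdk.loadtests.JobFailure source):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    // Illustrative: run the pipeline and fail if it ends in a terminal state
    // other than DONE (e.g. CANCELLED, as in this build, or FAILED).
    static void runAndVerify(Pipeline pipeline) {
      PipelineResult result = pipeline.run();
      PipelineResult.State state = result.waitUntilFinish();
      if (state != PipelineResult.State.DONE) {
        throw new RuntimeException("Invalid job state: " + state + ".");
      }
    }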

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
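
For this job the failing task can be re-run with the suggested flags, for example (the Jenkins job's pipeline options would still need to be supplied):

> ./gradlew :sdks:java:testing:load-tests:run --stacktrace --info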

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 47m 15s
90 actionable tasks: 56 executed, 34 from cache

Publishing build scan...

Publishing failed.

The build scan server appears to be unavailable.
Please check https://status.gradle.com for the latest service status.

If the service is reported as available, please report this problem via https://gradle.com/help/plugin and include the following via copy/paste:

----------
Gradle version: 6.8.3
Plugin version: 3.4.1
Request URL: https://status.gradle.com
Request ID: 5fee944c-55d7-4d40-b582-8f2578c36e78
Response status code: 405
Response server type: Varnish
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_LoadTests_Java_ParDo_Dataflow_Streaming #849

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_Streaming/849/display/redirect?page=changes>




Build failed in Jenkins: beam_LoadTests_Java_ParDo_Dataflow_Streaming #848

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_Streaming/848/display/redirect>

Changes:


------------------------------------------
[...truncated 7.52 KB...]
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:fn-execution:createCheckerFrameworkManifest
> Task :sdks:java:extensions:google-cloud-platform-core:createCheckerFrameworkManifest
> Task :sdks:java:harness:createCheckerFrameworkManifest
> Task :model:pipeline:createCheckerFrameworkManifest
> Task :model:job-management:createCheckerFrameworkManifest
> Task :runners:core-java:createCheckerFrameworkManifest
> Task :sdks:java:expansion-service:createCheckerFrameworkManifest
> Task :model:fn-execution:createCheckerFrameworkManifest
> Task :runners:java-fn-execution:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:createCheckerFrameworkManifest
> Task :sdks:java:core:createCheckerFrameworkManifest
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-construction-java:createCheckerFrameworkManifest
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:extensions:arrow:createCheckerFrameworkManifest
> Task :sdks:java:extensions:protobuf:createCheckerFrameworkManifest
> Task :sdks:java:extensions:arrow:processResources NO-SOURCE
> Task :model:job-management:extractProto
> Task :model:fn-execution:extractProto
> Task :sdks:java:io:synthetic:createCheckerFrameworkManifest
> Task :sdks:java:extensions:protobuf:extractProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:createCheckerFrameworkManifest
> Task :sdks:java:io:kinesis:createCheckerFrameworkManifest
> Task :sdks:java:io:kafka:createCheckerFrameworkManifest
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:createCheckerFrameworkManifest
> Task :sdks:java:testing:load-tests:createCheckerFrameworkManifest
> Task :sdks:java:testing:test-utils:createCheckerFrameworkManifest
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :runners:google-cloud-dataflow-java:processResources
> Task :sdks:java:core:processResources
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractIncludeProto
> Task :model:pipeline:extractIncludeProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractProto
> Task :model:pipeline:extractProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:generateProto FROM-CACHE
> Task :model:pipeline:generateProto FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:processResources
> Task :runners:google-cloud-dataflow-java:worker:windmill:classes
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :runners:google-cloud-dataflow-java:worker:windmill:shadowJar FROM-CACHE
> Task :model:pipeline:jar
> Task :model:pipeline:shadowJar FROM-CACHE
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto FROM-CACHE
> Task :model:fn-execution:generateProto FROM-CACHE
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:job-management:shadowJar FROM-CACHE
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:extensions:arrow:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:extensions:arrow:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:extensions:arrow:jar
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar FROM-CACHE

> Task :sdks:java:testing:load-tests:run
Sep 13, 2021 12:12:12 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
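
As the warning suggests, the newer flag is a drop-in replacement on the load-test command line, for example (the image value is only a placeholder):

> --sdkContainerImage=gcr.io/<project>/<sdk-container-image>   instead of   --workerHarnessContainerImage=gcr.io/<project>/<sdk-container-image>
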
Sep 13, 2021 12:12:13 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 13, 2021 12:12:13 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 13, 2021 12:12:14 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: ParDo(TimeMonitor)
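
This warning can be avoided by giving each application of the transform an explicit, unique step name. A small sketch against the Beam Java API (the DoFn and collection names here are illustrative, not the load test's actual TimeMonitor):

    // Illustrative DoFn; the real load test uses its own TimeMonitor ParDo.
    static class RecordTimeFn extends DoFn<String, String> {
      @ProcessElement
      public void process(@Element String element, OutputReceiver<String> out) {
        out.output(element);
      }
    }

    // Applying the same DoFn twice with distinct, stable step names
    // avoids the "transforms do not have stable unique names" warning.
    PCollection<String> timed = input.apply("TimeMonitorStart", ParDo.of(new RecordTimeFn()));
    PCollection<String> done = timed.apply("TimeMonitorEnd", ParDo.of(new RecordTimeFn()));
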
Sep 13, 2021 12:12:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 13, 2021 12:12:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 196 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 13, 2021 12:12:17 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
Sep 13, 2021 12:12:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 196 files cached, 0 files newly uploaded in 1 seconds
Sep 13, 2021 12:12:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 13, 2021 12:12:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 13, 2021 12:12:20 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@618ff5c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@16727bf0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f84acf7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@291373d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@372ca2d6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3204e238, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@38ed139b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a5272be, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58ba5b30, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4dba773d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d9bd4da, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c58255, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@eac3a26, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@10b1a751, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53cf9c99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b306b9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@142213d5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@934b52f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2630dbc4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ea4300e]
Sep 13, 2021 12:12:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 13, 2021 12:12:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
Sep 13, 2021 12:12:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s4
Sep 13, 2021 12:12:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s5
Sep 13, 2021 12:12:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s6
Sep 13, 2021 12:12:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s7
Sep 13, 2021 12:12:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s8
Sep 13, 2021 12:12:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s9
Sep 13, 2021 12:12:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s10
Sep 13, 2021 12:12:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s11
Sep 13, 2021 12:12:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s12
Sep 13, 2021 12:12:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s13
Sep 13, 2021 12:12:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s14
Sep 13, 2021 12:12:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s15
Sep 13, 2021 12:12:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 13, 2021 12:12:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-13_05_12_21-17132838918776458649?project=apache-beam-testing
Sep 13, 2021 12:12:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-13_05_12_21-17132838918776458649
Sep 13, 2021 12:12:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-13_05_12_21-17132838918776458649
Sep 13, 2021 12:13:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-13T12:13:51.483Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java0dataflow0streaming0pardo01-jenkins-0913121-ku0j. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
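
A label-friendly name can also be set explicitly when the test is launched, e.g. via the Dataflow --jobName pipeline option with a value made of lowercase letters, digits, and hyphens (the value below is illustrative):

> --jobName=load-tests-java-dataflow-streaming-pardo-1
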
Sep 13, 2021 12:13:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:13:57.784Z: Worker configuration: e2-standard-4 in us-central1-a.
Sep 13, 2021 12:14:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-13T12:13:58.663Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 5 instances in region us-central1. Quota summary (required/available): 5/24386 instances, 20/0 CPUs, 2150/183716 disk GB, 0/2397 SSD disk GB, 1/288 instance groups, 1/291 managed instance groups, 1/517 instance templates, 5/615 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
Sep 13, 2021 12:14:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:13:58.698Z: Cleaning up.
Sep 13, 2021 12:14:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:13:58.760Z: Worker pool stopped.
Sep 13, 2021 12:14:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:13:59.936Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 13, 2021 12:14:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-13_05_12_21-17132838918776458649 failed with status FAILED.
Sep 13, 2021 12:14:03 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace pardo
Load test results for test (ID): 2bf8ef5b-cde2-4880-8406-b50e0c34aa21 and timestamp: 2021-09-13T12:12:14.397000000Z:
                 Metric:                    Value:
    dataflow_runtime_sec                       0.0
dataflow_total_bytes_count                      -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid job state: FAILED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.ParDoLoadTest.run(ParDoLoadTest.java:53)
	at org.apache.beam.sdk.loadtests.ParDoLoadTest.main(ParDoLoadTest.java:103)

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 47s
90 actionable tasks: 56 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/5wwc66bc4wif4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_ParDo_Dataflow_Streaming #847

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_Streaming/847/display/redirect>

Changes:


------------------------------------------
[...truncated 7.98 KB...]
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:harness:createCheckerFrameworkManifest
> Task :model:job-management:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:createCheckerFrameworkManifest
> Task :runners:core-java:createCheckerFrameworkManifest
> Task :model:pipeline:createCheckerFrameworkManifest
> Task :sdks:java:expansion-service:createCheckerFrameworkManifest
> Task :runners:core-construction-java:createCheckerFrameworkManifest
> Task :runners:java-fn-execution:createCheckerFrameworkManifest
> Task :sdks:java:core:createCheckerFrameworkManifest
> Task :sdks:java:extensions:google-cloud-platform-core:createCheckerFrameworkManifest
> Task :model:fn-execution:createCheckerFrameworkManifest
> Task :sdks:java:fn-execution:createCheckerFrameworkManifest
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :sdks:java:extensions:arrow:createCheckerFrameworkManifest
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:createCheckerFrameworkManifest
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:extensions:arrow:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:createCheckerFrameworkManifest
> Task :model:fn-execution:extractProto
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:createCheckerFrameworkManifest
> Task :model:job-management:extractProto
> Task :sdks:java:io:kafka:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:testing:test-utils:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:worker:windmill:createCheckerFrameworkManifest
> Task :sdks:java:io:kinesis:createCheckerFrameworkManifest
> Task :sdks:java:testing:load-tests:createCheckerFrameworkManifest
> Task :sdks:java:io:synthetic:createCheckerFrameworkManifest
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :runners:google-cloud-dataflow-java:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:core:processResources
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractIncludeProto
> Task :model:pipeline:extractIncludeProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractProto
> Task :model:pipeline:extractProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:generateProto FROM-CACHE
> Task :model:pipeline:generateProto FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:processResources
> Task :runners:google-cloud-dataflow-java:worker:windmill:classes
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :runners:google-cloud-dataflow-java:worker:windmill:shadowJar FROM-CACHE
> Task :model:pipeline:shadowJar FROM-CACHE
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:generateProto FROM-CACHE
> Task :model:job-management:generateProto FROM-CACHE
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :model:job-management:shadowJar FROM-CACHE
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:extensions:arrow:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:extensions:arrow:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:arrow:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:io:kinesis:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :sdks:java:testing:load-tests:jar
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar FROM-CACHE

> Task :sdks:java:testing:load-tests:run
Sep 12, 2021 12:09:38 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
Sep 12, 2021 12:09:40 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 12, 2021 12:09:40 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 12, 2021 12:09:41 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: ParDo(TimeMonitor)
Sep 12, 2021 12:09:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 12, 2021 12:09:44 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 196 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 12, 2021 12:09:44 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
Sep 12, 2021 12:09:45 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 196 files cached, 0 files newly uploaded in 0 seconds
Sep 12, 2021 12:09:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 12, 2021 12:09:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <101980 bytes, hash c348bed4d5769f23d7071b993d3769f3ad3e3917ba413b853836d7ed80a2c155> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-w0i-1NV2nyPXBxuZPTdp860-ORe6QTuFODbX7YCiwVU.pb
Sep 12, 2021 12:09:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 12, 2021 12:09:47 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b948f3e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4c2cd4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77a074b4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@333c8791, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c0e13b7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@22eaa86e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@561b7d53, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cc680e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1dc3502b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a1d3225, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67e13bd0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@50fb33a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2cae9b8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1457fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f94fb9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@17fa1336, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4228bf58, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@68b9834c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@20b9d5d5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@671d1157]
Sep 12, 2021 12:09:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 12, 2021 12:09:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
Sep 12, 2021 12:09:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s4
Sep 12, 2021 12:09:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s5
Sep 12, 2021 12:09:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s6
Sep 12, 2021 12:09:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s7
Sep 12, 2021 12:09:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s8
Sep 12, 2021 12:09:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s9
Sep 12, 2021 12:09:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s10
Sep 12, 2021 12:09:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s11
Sep 12, 2021 12:09:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s12
Sep 12, 2021 12:09:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s13
Sep 12, 2021 12:09:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s14
Sep 12, 2021 12:09:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s15
Sep 12, 2021 12:09:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 12, 2021 12:09:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_05_09_47-12188882791043996342?project=apache-beam-testing
Sep 12, 2021 12:09:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-12_05_09_47-12188882791043996342
Sep 12, 2021 12:09:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-12_05_09_47-12188882791043996342
Sep 12, 2021 12:09:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-12T12:09:53.501Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java0dataflow0streaming0pardo01-jenkins-0912120-ht9r. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
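The modified name shown above is derived automatically because the Jenkins-generated job name contains characters (such as underscores and uppercase letters) that Cloud labels do not allow. A compliant name passed through the standard --jobName pipeline option would use only lowercase letters, digits, and hyphens, for example something like --jobName=load-tests-java-dataflow-streaming-pardo-1 (an illustrative value, not what this Jenkins job actually sets).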
Sep 12, 2021 12:09:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:09:58.793Z: Worker configuration: e2-standard-4 in us-central1-a.
Sep 12, 2021 12:10:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-12T12:09:59.549Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 5 instances in region us-central1. Quota summary (required/available): 5/24378 instances, 20/0 CPUs, 2150/186841 disk GB, 0/2397 SSD disk GB, 1/287 instance groups, 1/290 managed instance groups, 1/516 instance templates, 5/607 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
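For context, the quota in question is a Compute Engine regional quota, and its current usage and limit can be inspected with the gcloud tool before retrying, for example (shown only as a pointer; the Jenkins job itself does not run this):
> gcloud compute regions describe us-central1 --project=apache-beam-testing
The output lists each regional quota, including CPUS, with its usage and limit.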
Sep 12, 2021 12:10:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:09:59.588Z: Cleaning up.
Sep 12, 2021 12:10:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:09:59.634Z: Worker pool stopped.
Sep 12, 2021 12:10:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:10:00.804Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 12, 2021 12:10:06 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-12_05_09_47-12188882791043996342 failed with status FAILED.
Sep 12, 2021 12:10:06 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace pardo
Load test results for test (ID): 2f84d964-71ea-4a8d-8a81-d3325fd2e049 and timestamp: 2021-09-12T12:09:40.945000000Z:
                 Metric:                    Value:
    dataflow_runtime_sec                       0.0
dataflow_total_bytes_count                      -1.0
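The missing totalBytes.count counter and the -1.0 value above are a consequence of the quota failure: no workers ever started, so the counter was never reported. For reference, a hedged sketch of how such a counter is typically defined and read back through the Beam metrics API follows; the namespace and counter name mirror the log, but the surrounding code is illustrative rather than the load test's actual source:

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.metrics.MetricsFilter;
    import org.apache.beam.sdk.transforms.DoFn;

    public class ParDoCounterSketch {

      // A DoFn that counts processed bytes under namespace "pardo",
      // counter name "totalBytes.count" (the metric named in the log above).
      public static class ByteCountingFn extends DoFn<byte[], byte[]> {
        private final Counter totalBytes = Metrics.counter("pardo", "totalBytes.count");

        @ProcessElement
        public void processElement(@Element byte[] element, OutputReceiver<byte[]> out) {
          totalBytes.inc(element.length);
          out.output(element);
        }
      }

      // After the job finishes, read back the attempted value of the counter.
      // Returns 0 here if the counter was never reported (e.g. no workers started).
      public static long readTotalBytes(PipelineResult result) {
        MetricQueryResults metrics =
            result.metrics().queryMetrics(
                MetricsFilter.builder()
                    .addNameFilter(MetricNameFilter.named("pardo", "totalBytes.count"))
                    .build());
        long total = 0;
        for (MetricResult<Long> counter : metrics.getCounters()) {
          total += counter.getAttempted();
        }
        return total;
      }
    }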
Exception in thread "main" java.lang.RuntimeException: Invalid job state: FAILED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.ParDoLoadTest.run(ParDoLoadTest.java:53)
	at org.apache.beam.sdk.loadtests.ParDoLoadTest.main(ParDoLoadTest.java:103)
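The RuntimeException is the load test's own failure handling: once the Dataflow job reaches the FAILED terminal state, the test aborts with a non-zero exit code, which Gradle then reports below. A hedged sketch of that pattern (not the actual JobFailure implementation) is:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    public final class JobStateCheck {
      private JobStateCheck() {}

      // Run the pipeline, block until a terminal state, and fail loudly on
      // anything other than DONE, mirroring the behaviour seen in this log.
      public static void runAndCheck(Pipeline pipeline) {
        PipelineResult result = pipeline.run();
        PipelineResult.State state = result.waitUntilFinish();
        if (state != PipelineResult.State.DONE) {
          throw new RuntimeException("Invalid job state: " + state + ".");
        }
      }
    }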

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 16s
90 actionable tasks: 56 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/r6hcyivy3umbc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_ParDo_Dataflow_Streaming #846

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_Streaming/846/display/redirect?page=changes>

Changes:

[noreply] Added type annotations to some combiners missing it. (#15414)

[noreply] [BEAM-12634] JmsIO auto scaling feature (#15464)

[noreply] [BEAM-12662] Get Flink version from cluster. (#15223)

[noreply] Port changes from Pub/Sub Lite to beam (#15418)

[heejong] [BEAM-12805] Fix XLang CombinePerKey test by explicitly assigning the

[BenWhitehead] [BEAM-8376] Google Cloud Firestore Connector - Add handling for

[noreply] Decreasing peak memory usage for beam.TupleCombineFn (#15494)

[noreply] [BEAM-12802] Add support for prefetch through data layers down through

[noreply] [BEAM-11097] Add implementation of side input cache (#15483)


------------------------------------------
[...truncated 7.99 KB...]
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:core:createCheckerFrameworkManifest
> Task :runners:core-construction-java:createCheckerFrameworkManifest
> Task :runners:java-fn-execution:createCheckerFrameworkManifest
> Task :sdks:java:harness:createCheckerFrameworkManifest
> Task :model:fn-execution:createCheckerFrameworkManifest
> Task :model:job-management:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:createCheckerFrameworkManifest
> Task :sdks:java:expansion-service:createCheckerFrameworkManifest
> Task :runners:core-java:createCheckerFrameworkManifest
> Task :sdks:java:extensions:google-cloud-platform-core:createCheckerFrameworkManifest
> Task :model:pipeline:createCheckerFrameworkManifest
> Task :sdks:java:fn-execution:createCheckerFrameworkManifest
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:createCheckerFrameworkManifest
> Task :model:job-management:extractProto
> Task :model:fn-execution:extractProto
> Task :sdks:java:extensions:arrow:createCheckerFrameworkManifest
> Task :sdks:java:extensions:arrow:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:io:kafka:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:worker:windmill:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:createCheckerFrameworkManifest
> Task :sdks:java:io:kinesis:createCheckerFrameworkManifest
> Task :sdks:java:io:synthetic:createCheckerFrameworkManifest
> Task :sdks:java:io:google-cloud-platform:createCheckerFrameworkManifest
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:testing:test-utils:createCheckerFrameworkManifest
> Task :sdks:java:testing:load-tests:createCheckerFrameworkManifest
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :runners:google-cloud-dataflow-java:processResources
> Task :sdks:java:core:processResources
> Task :model:pipeline:extractIncludeProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractIncludeProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractProto
> Task :model:pipeline:extractProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:generateProto FROM-CACHE
> Task :model:pipeline:generateProto FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:processResources
> Task :runners:google-cloud-dataflow-java:worker:windmill:classes
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :runners:google-cloud-dataflow-java:worker:windmill:shadowJar FROM-CACHE
> Task :model:pipeline:jar
> Task :model:pipeline:shadowJar FROM-CACHE
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto FROM-CACHE
> Task :model:fn-execution:generateProto FROM-CACHE
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:job-management:shadowJar FROM-CACHE
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:extensions:arrow:compileJava FROM-CACHE
> Task :sdks:java:extensions:arrow:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:extensions:arrow:jar
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:testing:test-utils:jar
> Task :runners:core-construction-java:jar
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:jar
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar FROM-CACHE

> Task :sdks:java:testing:load-tests:run
Sep 11, 2021 12:17:39 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
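This deprecation warning only concerns how the worker container image is passed: newer Beam SDKs prefer a single --sdkContainerImage=<image> pipeline option over the legacy --workerHarnessContainerImage=<image> flag. Both point the Dataflow workers at the SDK harness container to use, and the image value itself is unchanged.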
Sep 11, 2021 12:17:40 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 11, 2021 12:17:40 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 11, 2021 12:17:41 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: ParDo(TimeMonitor)
Sep 11, 2021 12:17:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 11, 2021 12:17:44 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 196 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 11, 2021 12:17:44 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
Sep 11, 2021 12:17:45 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 196 files cached, 0 files newly uploaded in 0 seconds
Sep 11, 2021 12:17:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 11, 2021 12:17:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <101980 bytes, hash f11b7ef250e091304c9cf19031706a3aea5f632e4c5d20946b5037aab4bb9e14> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-8Rt-8lDgkTBMnPGQMXBqOupfYy5MXSCUa1A3qrS7nhQ.pb
Sep 11, 2021 12:17:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 11, 2021 12:17:46 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b948f3e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f4c2cd4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77a074b4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@333c8791, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c0e13b7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@22eaa86e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@561b7d53, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cc680e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1dc3502b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a1d3225, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67e13bd0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@50fb33a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2cae9b8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1457fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f94fb9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@17fa1336, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4228bf58, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@68b9834c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@20b9d5d5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@671d1157]
Sep 11, 2021 12:17:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 11, 2021 12:17:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
Sep 11, 2021 12:17:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s4
Sep 11, 2021 12:17:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s5
Sep 11, 2021 12:17:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s6
Sep 11, 2021 12:17:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s7
Sep 11, 2021 12:17:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s8
Sep 11, 2021 12:17:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s9
Sep 11, 2021 12:17:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s10
Sep 11, 2021 12:17:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s11
Sep 11, 2021 12:17:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s12
Sep 11, 2021 12:17:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s13
Sep 11, 2021 12:17:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s14
Sep 11, 2021 12:17:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s15
Sep 11, 2021 12:17:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 11, 2021 12:17:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-11_05_17_47-15190846134105635233?project=apache-beam-testing
Sep 11, 2021 12:17:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-11_05_17_47-15190846134105635233
Sep 11, 2021 12:17:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-11_05_17_47-15190846134105635233
Sep 11, 2021 12:17:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-11T12:17:51.554Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java0dataflow0streaming0pardo01-jenkins-0911121-owis. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 11, 2021 12:17:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-11T12:17:55.702Z: Worker configuration: e2-standard-4 in us-central1-a.
Sep 11, 2021 12:17:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-11T12:17:56.450Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 5 instances in region us-central1. Quota summary (required/available): 5/24362 instances, 20/4 CPUs, 2150/186421 disk GB, 0/2397 SSD disk GB, 1/234 instance groups, 1/237 managed instance groups, 1/456 instance templates, 5/591 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
Sep 11, 2021 12:17:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-11T12:17:56.496Z: Cleaning up.
Sep 11, 2021 12:17:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-11T12:17:56.587Z: Worker pool stopped.
Sep 11, 2021 12:17:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-11T12:17:57.890Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 11, 2021 12:18:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-11_05_17_47-15190846134105635233 failed with status FAILED.
Sep 11, 2021 12:18:02 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace pardo
Load test results for test (ID): 727b14c9-b3fc-4541-9a75-bd1a8e1f45d2 and timestamp: 2021-09-11T12:17:41.205000000Z:
                 Metric:                    Value:
    dataflow_runtime_sec                       0.0
dataflow_total_bytes_count                      -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid job state: FAILED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.ParDoLoadTest.run(ParDoLoadTest.java:53)
	at org.apache.beam.sdk.loadtests.ParDoLoadTest.main(ParDoLoadTest.java:103)

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57s
90 actionable tasks: 56 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/ua5ts4spkbyc2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_ParDo_Dataflow_Streaming #845

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_Streaming/845/display/redirect?page=changes>

Changes:

[ruwan.lambrichts] Clarify additional_bq_parameters argument

[kawaigin] [BEAM-10708] Support streaming cache in beam_sql magic

[noreply] Fix broken 'differences from pandas' link

[noreply] Added GroupBy row in Aggregation table.

[Luke Cwik] [BEAM-12769] Fix typo in test class name, CLass -> Class

[Etienne Chauchot] [BEAM-5172] Temporary ignore testSplit and testSizes tests waiting for a

[samuelw] [BEAM-12740] Remove matching to filter files when renaming gcs files in

[noreply] [BEAM-3304] Helper functions for triggers (#15430)

[esert] Bump a throttling counter on BigQueryRead retries due to

[noreply] [BEAM-5097] Increment counter for "small words" in go SDK example

[noreply] Register MapCoder, some comments/cleanup. (#15471)

[noreply] [BEAM-12588] Multimap user state proto changes (#15473)


------------------------------------------
[...truncated 9.71 KB...]
> Task :sdks:java:extensions:google-cloud-platform-core:createCheckerFrameworkManifest
> Task :model:pipeline:createCheckerFrameworkManifest
> Task :model:fn-execution:createCheckerFrameworkManifest
> Task :sdks:java:harness:createCheckerFrameworkManifest
> Task :sdks:java:expansion-service:createCheckerFrameworkManifest
> Task :runners:core-construction-java:createCheckerFrameworkManifest
> Task :model:job-management:createCheckerFrameworkManifest
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:extensions:protobuf:createCheckerFrameworkManifest
> Task :sdks:java:extensions:arrow:createCheckerFrameworkManifest
> Task :sdks:java:extensions:arrow:processResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :sdks:java:extensions:protobuf:extractProto
> Task :model:job-management:extractProto
> Task :sdks:java:io:kinesis:createCheckerFrameworkManifest
> Task :sdks:java:io:kafka:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:worker:windmill:createCheckerFrameworkManifest
> Task :sdks:java:io:google-cloud-platform:createCheckerFrameworkManifest
> Task :sdks:java:io:synthetic:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:createCheckerFrameworkManifest
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:createCheckerFrameworkManifest
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :model:fn-execution:processResources
> Task :runners:google-cloud-dataflow-java:processResources
> Task :sdks:java:core:processResources
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractIncludeProto
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:generateProto FROM-CACHE
> Task :model:pipeline:generateProto FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:processResources
> Task :runners:google-cloud-dataflow-java:worker:windmill:classes
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :runners:google-cloud-dataflow-java:worker:windmill:shadowJar FROM-CACHE
> Task :model:pipeline:jar
> Task :model:pipeline:shadowJar FROM-CACHE
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:generateProto FROM-CACHE
> Task :model:job-management:generateProto FROM-CACHE
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :model:job-management:shadowJar FROM-CACHE
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:extensions:arrow:compileJava FROM-CACHE
> Task :sdks:java:extensions:arrow:classes UP-TO-DATE
> Task :sdks:java:extensions:arrow:jar
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar FROM-CACHE

> Task :sdks:java:io:synthetic:compileJava
Note: <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_Streaming/ws/src/sdks/java/io/synthetic/src/main/java/org/apache/beam/sdk/io/synthetic/SyntheticBoundedSource.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :sdks:java:io:synthetic:classes
> Task :sdks:java:io:synthetic:jar

> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
Sep 10, 2021 12:30:57 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
Sep 10, 2021 12:30:58 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 10, 2021 12:30:58 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 10, 2021 12:30:59 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: ParDo(TimeMonitor)
Sep 10, 2021 12:30:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 10, 2021 12:31:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 196 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 10, 2021 12:31:03 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-h_pN1a12Kq6Q7UgG0NwuMuy2KLX9eKEV-XpQQIHwHqY.jar
Sep 10, 2021 12:31:03 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 196 files cached, 0 files newly uploaded in 0 seconds
Sep 10, 2021 12:31:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 10, 2021 12:31:03 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <101986 bytes, hash 9c0abb6b7b602273f148c599e79151ec6272d04692c0d32bf3bd730d1ee47c32> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-nAq7a3tgInPxSMWZ55FR7GJy0EaSwNMr871zDR7kfDI.pb
Sep 10, 2021 12:31:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 10, 2021 12:31:05 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77a074b4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@333c8791, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c0e13b7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@22eaa86e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@561b7d53, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cc680e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1dc3502b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a1d3225, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67e13bd0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@50fb33a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2cae9b8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1457fde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f94fb9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@17fa1336, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4228bf58, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@68b9834c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@20b9d5d5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@671d1157, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60c8a093, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44cffc25]
Sep 10, 2021 12:31:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 10, 2021 12:31:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
Sep 10, 2021 12:31:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s4
Sep 10, 2021 12:31:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s5
Sep 10, 2021 12:31:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s6
Sep 10, 2021 12:31:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s7
Sep 10, 2021 12:31:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s8
Sep 10, 2021 12:31:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s9
Sep 10, 2021 12:31:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s10
Sep 10, 2021 12:31:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s11
Sep 10, 2021 12:31:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s12
Sep 10, 2021 12:31:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s13
Sep 10, 2021 12:31:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s14
Sep 10, 2021 12:31:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s15
Sep 10, 2021 12:31:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 10, 2021 12:31:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-10_05_31_05-18109980974285651161?project=apache-beam-testing
Sep 10, 2021 12:31:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-10_05_31_05-18109980974285651161
Sep 10, 2021 12:31:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-10_05_31_05-18109980974285651161
Sep 10, 2021 12:31:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-10T12:31:12.783Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java0dataflow0streaming0pardo01-jenkins-0910123-5bgp. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 10, 2021 12:31:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-10T12:31:19.463Z: Worker configuration: e2-standard-4 in us-central1-a.
Sep 10, 2021 12:31:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-10T12:31:20.114Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 5 instances in region us-central1. Quota summary (required/available): 5/24297 instances, 20/0 CPUs, 2150/218301 disk GB, 0/2397 SSD disk GB, 1/192 instance groups, 1/195 managed instance groups, 1/419 instance templates, 5/526 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
Sep 10, 2021 12:31:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-10T12:31:20.158Z: Cleaning up.
Sep 10, 2021 12:31:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-10T12:31:20.201Z: Worker pool stopped.
Sep 10, 2021 12:31:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-10T12:31:21.372Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 10, 2021 12:31:27 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-10_05_31_05-18109980974285651161 failed with status FAILED.
Sep 10, 2021 12:31:27 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace pardo
Load test results for test (ID): 3c09e23e-8e83-4acb-80af-ded3db152bb5 and timestamp: 2021-09-10T12:30:59.237000000Z:
                 Metric:                    Value:
    dataflow_runtime_sec                       0.0
dataflow_total_bytes_count                      -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid job state: FAILED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.ParDoLoadTest.run(ParDoLoadTest.java:53)
	at org.apache.beam.sdk.loadtests.ParDoLoadTest.main(ParDoLoadTest.java:103)

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 22s
90 actionable tasks: 58 executed, 32 from cache

Publishing build scan...
https://gradle.com/s/6zwxedrlwnvle

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org