Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/09/11 02:01:15 UTC
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8373
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8373/display/redirect?page=changes>
Changes:
[noreply] Port changes from Pub/Sub Lite to beam (#15418)
[heejong] [BEAM-12805] Fix XLang CombinePerKey test by explicitly assigning the
[BenWhitehead] [BEAM-8376] Google Cloud Firestore Connector - Add handling for
[noreply] Decreasing peak memory usage for beam.TupleCombineFn (#15494)
[noreply] [BEAM-12802] Add support for prefetch through data layers down through
[noreply] [BEAM-11097] Add implementation of side input cache (#15483)
------------------------------------------
[...truncated 4.20 MB...]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T01:37:10.602Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T01:37:10.628Z: JOB_MESSAGE_ERROR: Workflow failed.
[...the same QUOTA_EXCEEDED / "Workflow failed." JOB_MESSAGE_ERROR pair repeats 61 more times, roughly every 13 seconds, through 2021-09-11T01:50:09.055Z...]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T01:50:15.831Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2021-09-10_17_35_30-6385344182709365556.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T01:50:15.853Z: JOB_MESSAGE_BASIC: Finished operation Start 2/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream+Start 2/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets+Start 2/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Start 2/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Start 2/Map(decode)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/CoGroupByKeyImpl/Tag[1]+assert_that/Group/CoGroupByKeyImpl/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T01:50:15.853Z: JOB_MESSAGE_BASIC: Finished operation Start 3/Impulse+Start 3/FlatMap(<lambda at core.py:2965>)+Start 3/MaybeReshuffle/Reshuffle/AddRandomKeys+Start 3/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Start 3/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T01:50:15.853Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/ReadStream+assert_that/Group/CoGroupByKeyImpl/GroupByKey/MergeBuckets+assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)+assert_that/Group/RestoreTags+assert_that/Unkey+assert_that/Match
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T01:50:15.853Z: JOB_MESSAGE_BASIC: Finished operation Start 1/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream+Start 1/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets+Start 1/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Start 1/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Start 1/Map(decode)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/CoGroupByKeyImpl/Tag[1]+assert_that/Group/CoGroupByKeyImpl/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T01:50:15.853Z: JOB_MESSAGE_BASIC: Finished operation Start 1/Impulse+Start 1/FlatMap(<lambda at core.py:2965>)+Start 1/MaybeReshuffle/Reshuffle/AddRandomKeys+Start 1/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Start 1/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T01:50:15.853Z: JOB_MESSAGE_BASIC: Finished operation Start 3/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream+Start 3/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets+Start 3/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Start 3/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Start 3/Map(decode)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/CoGroupByKeyImpl/Tag[1]+assert_that/Group/CoGroupByKeyImpl/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T01:50:15.853Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at core.py:2965>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T01:50:15.853Z: JOB_MESSAGE_BASIC: Finished operation Start 2/Impulse+Start 2/FlatMap(<lambda at core.py:2965>)+Start 2/MaybeReshuffle/Reshuffle/AddRandomKeys+Start 2/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Start 2/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T01:50:15.940Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T01:50:16.002Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T01:50:16.024Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2021-09-10_17_35_30-6385344182709365556 is in state JOB_STATE_CANCELLING
[33m=============================== warnings summary ===============================[0m
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
def call(self, fn, *args, **kwargs):
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py38-xdist.xml> -
======== 15 failed, 14 passed, 3 skipped, 8 warnings in 5210.32 seconds ========
> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 2h 0m 40s
92 actionable tasks: 64 executed, 28 from cache
Publishing build scan...
https://gradle.com/s/qbvwrjbaeean6
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #8385
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8385/display/redirect?page=changes>
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8384
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8384/display/redirect?page=changes>
Changes:
[rohde.samuel] [BEAM-12842] Add timestamp to test work item to deflake
[suztomo] [BEAM-12873] HL7v2IO: to leave schematizedData null, not empty
------------------------------------------
[...truncated 2.34 MB...]
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/requirements.txt in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/pbr-5.5.1.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/pbr-5.5.1.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/pbr-5.6.0.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/pbr-5.6.0.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/mock-2.0.0.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/mock-2.0.0.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/six-1.16.0.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/six-1.16.0.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/soupsieve-2.2.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/soupsieve-2.2.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/soupsieve-2.2.1.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/soupsieve-2.2.1.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/PyHamcrest-1.10.1.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/parameterized-0.7.5.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/parameterized-0.7.5.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/beautifulsoup4-4.9.3.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/beautifulsoup4-4.10.0.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/dataflow_python_sdk.tar...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/dataflow_python_sdk.tar in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/dataflow-worker.jar...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/dataflow-worker.jar in 4 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/pipeline.pb...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/pipeline.pb in 0 seconds.
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 Create job: <Job
clientRequestId: '20210913181637395747-3550'
createTime: '2021-09-13T18:16:45.557152Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-09-13_11_16_44-5357630715320491649'
location: 'us-central1'
name: 'beamapp-jenkins-0913181637-393752'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-09-13T18:16:45.557152Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Created job with id: [2021-09-13_11_16_44-5357630715320491649]
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 Submitted job: 2021-09-13_11_16_44-5357630715320491649
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-13_11_16_44-5357630715320491649?project=apache-beam-testing
WARNING  apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
__________________ SideInputsTest.test_reiterable_side_input ___________________
[gw3] linux -- Python 3.6.8 <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/bin/python3.6>
self = <apache_beam.transforms.sideinputs_test.SideInputsTest testMethod=test_reiterable_side_input>
    @pytest.mark.it_validatesrunner
    def test_reiterable_side_input(self):
      expected_side = frozenset(range(100))

      def check_reiteration(main, side):
        assert expected_side == set(side), side
        # Iterate a second time.
        assert expected_side == set(side), side
        # Iterate over two copies of the input at the same time.
        both = zip(side, side)
        first, second = zip(*both)
        assert expected_side == set(first), first
        assert expected_side == set(second), second
        # This will iterate over two copies of the side input, but offset.
        offset = [None] * (len(expected_side) // 2)
        both = zip(itertools.chain(side, offset), itertools.chain(offset, side))
        first, second = zip(*both)
        expected_and_none = frozenset.union(expected_side, [None])
        assert expected_and_none == set(first), first
        assert expected_and_none == set(second), second

      pipeline = self.create_pipeline()
      pcol = pipeline | 'start' >> beam.Create(['A', 'B'])
      side = pipeline | 'side' >> beam.Create(expected_side)
      _ = pcol | 'check' >> beam.Map(check_reiteration, beam.pvalue.AsIter(side))
>     pipeline.run()
apache_beam/transforms/sideinputs_test.py:220:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <apache_beam.testing.test_pipeline.TestPipeline object at 0x7f83aad18c18>
test_runner_api = True
    def run(self, test_runner_api=True):
      result = super(TestPipeline, self).run(
          test_runner_api=(
              False if self.not_use_test_runner_api else test_runner_api))
      if self.blocking:
        state = result.wait_until_finish()
>       assert state in (PipelineState.DONE, PipelineState.CANCELLED), \
            "Pipeline execution failed."
E       AssertionError: Pipeline execution failed.
E       assert 'FAILED' in ('DONE', 'CANCELLED')
apache_beam/testing/test_pipeline.py:117: AssertionError
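[Editor's note] The failing test above exercises one specific contract: a side input passed via beam.pvalue.AsIter must be *reiterable*, i.e. it can be traversed more than once, unlike a one-shot Python generator. A minimal, Beam-free sketch of that property (plain stdlib Python; the names here are illustrative, not part of the test suite):

```python
# Standalone sketch of the reiteration property checked by
# test_reiterable_side_input: a side input must survive repeated
# and interleaved iteration, which a one-shot generator cannot.
expected_side = frozenset(range(100))

def check_reiteration(side):
    # First and second full passes must both see every element.
    assert expected_side == set(side), side
    assert expected_side == set(side), side
    # Two interleaved copies at the same time.
    first, second = zip(*zip(side, side))
    assert expected_side == set(first), first
    assert expected_side == set(second), second

# A list (like a reiterable side input) passes the check.
check_reiteration(list(expected_side))

# A generator is exhausted after one pass, so the same check fails on it.
one_shot = (x for x in range(100))
set(one_shot)                   # first iteration consumes everything
assert set(one_shot) == set()   # nothing is left for a second pass
```

On a runner whose side-input iterables are not reiterable, the second `set(side)` inside `check_reiteration` would see an empty sequence, which is exactly the class of bug this ValidatesRunner test guards against.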
------------------------------ Captured log call -------------------------------
INFO  apache_beam.runners.portability.stager:stager.py:644 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/bin/python3.6>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
INFO  apache_beam.runners.portability.stager:stager.py:300 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING  root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO  root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.6_sdk:2.34.0.dev
INFO  root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20210809
INFO  root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20210809" for Docker environment
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/requirements.txt...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/requirements.txt in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/pbr-5.5.1.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/pbr-5.5.1.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/pbr-5.6.0.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/pbr-5.6.0.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/mock-2.0.0.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/mock-2.0.0.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/six-1.16.0.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/six-1.16.0.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/soupsieve-2.2.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/soupsieve-2.2.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/soupsieve-2.2.1.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/soupsieve-2.2.1.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/PyHamcrest-1.10.1.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/parameterized-0.7.5.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/parameterized-0.7.5.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/beautifulsoup4-4.9.3.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/beautifulsoup4-4.10.0.tar.gz...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/dataflow_python_sdk.tar...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/dataflow_python_sdk.tar in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/dataflow-worker.jar...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/dataflow-worker.jar in 4 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/pipeline.pb...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/pipeline.pb in 0 seconds.
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 Create job: <Job
clientRequestId: '20210913181639718234-3550'
createTime: '2021-09-13T18:16:47.729962Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-09-13_11_16_46-11063527506877281827'
location: 'us-central1'
name: 'beamapp-jenkins-0913181639-716480'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-09-13T18:16:47.729962Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Created job with id: [2021-09-13_11_16_46-11063527506877281827]
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 Submitted job: 2021-09-13_11_16_46-11063527506877281827
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-13_11_16_46-11063527506877281827?project=apache-beam-testing
WARNING  apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py36-xdist.xml> -
=============== 26 failed, 3 passed, 3 skipped in 210.65 seconds ===============
> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 16m 23s
92 actionable tasks: 63 executed, 29 from cache
Publishing build scan...
https://gradle.com/s/nl6ak6r4bjbta
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8383
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8383/display/redirect>
Changes:
------------------------------------------
[...truncated 2.17 MB...]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-13_05_10_13-17203889701883546549?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
______________ ReshuffleTest.test_reshuffle_preserves_timestamps _______________
[gw2] linux -- Python 3.6.8 <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/bin/python3.6>
self = <apache_beam.transforms.util_test.ReshuffleTest testMethod=test_reshuffle_preserves_timestamps>
@pytest.mark.it_validatesrunner
def test_reshuffle_preserves_timestamps(self):
  with TestPipeline() as pipeline:

    # Create a PCollection and assign each element with a different timestamp.
    before_reshuffle = (
        pipeline
        | beam.Create([
            {
                'name': 'foo', 'timestamp': MIN_TIMESTAMP
            },
            {
                'name': 'foo', 'timestamp': 0
            },
            {
                'name': 'bar', 'timestamp': 33
            },
            {
                'name': 'bar', 'timestamp': 0
            },
        ])
        | beam.Map(
            lambda element: beam.window.TimestampedValue(
                element, element['timestamp'])))

    # Reshuffle the PCollection above and assign the timestamp of an element
    # to that element again.
    after_reshuffle = before_reshuffle | beam.Reshuffle()

    # Given an element, emits a string which contains the timestamp and the
    # name field of the element.
    def format_with_timestamp(element, timestamp=beam.DoFn.TimestampParam):
      t = str(timestamp)
      if timestamp == MIN_TIMESTAMP:
        t = 'MIN_TIMESTAMP'
      elif timestamp == MAX_TIMESTAMP:
        t = 'MAX_TIMESTAMP'
      return '{} - {}'.format(t, element['name'])

    # Combine each element in before_reshuffle with its timestamp.
    formatted_before_reshuffle = (
        before_reshuffle
        | "Get before_reshuffle timestamp" >> beam.Map(format_with_timestamp))

    # Combine each element in after_reshuffle with its timestamp.
    formatted_after_reshuffle = (
        after_reshuffle
        | "Get after_reshuffle timestamp" >> beam.Map(format_with_timestamp))

    expected_data = [
        'MIN_TIMESTAMP - foo',
        'Timestamp(0) - foo',
        'Timestamp(33) - bar',
        'Timestamp(0) - bar'
    ]

    # Can't compare formatted_before_reshuffle and formatted_after_reshuffle
    # directly, because they are deferred PCollections while equal_to only
    # takes a concrete argument.
    assert_that(
        formatted_before_reshuffle,
        equal_to(expected_data),
        label="formatted_before_reshuffle")
    assert_that(
        formatted_after_reshuffle,
        equal_to(expected_data),
>       label="formatted_after_reshuffle")
apache_beam/transforms/util_test.py:614:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/pipeline.py:587: in __exit__
    self.result = self.run()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <apache_beam.testing.test_pipeline.TestPipeline object at 0x7fb11e0b5198>
test_runner_api = True
def run(self, test_runner_api=True):
  result = super(TestPipeline, self).run(
      test_runner_api=(
          False if self.not_use_test_runner_api else test_runner_api))
  if self.blocking:
    state = result.wait_until_finish()
>   assert state in (PipelineState.DONE, PipelineState.CANCELLED), \
        "Pipeline execution failed."
E   AssertionError: Pipeline execution failed.
E   assert 'FAILED' in ('DONE', 'CANCELLED')
apache_beam/testing/test_pipeline.py:117: AssertionError
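For readers skimming the traceback: TestPipeline runs the job in blocking mode, and the assertion that fires is simply a gate on the job's terminal state, so any Dataflow job that ends FAILED — whatever the underlying cause, e.g. the worker-pool quota exhaustion seen in other runs of this suite — fails the test with this same message. A standalone sketch of that gate (the function name is hypothetical, not the Beam source):

```python
# Sketch of the terminal-state gate from test_pipeline.py shown above:
# any terminal state other than DONE or CANCELLED raises AssertionError,
# which pytest then reports once per test that submitted a job.
_OK_STATES = ('DONE', 'CANCELLED')

def check_terminal_state(state):
  assert state in _OK_STATES, "Pipeline execution failed."
```

This is why an infrastructure problem in the region shows up as 26 distinct test failures: each test submits its own job, waits, and hits the same gate.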
------------------------------ Captured log call -------------------------------
INFO apache_beam.runners.portability.stager:stager.py:644 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/bin/python3.6>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
INFO apache_beam.runners.portability.stager:stager.py:300 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.6_sdk:2.34.0.dev
INFO root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20210809
INFO root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20210809" for Docker environment
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/requirements.txt...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/requirements.txt in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/pbr-5.5.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/pbr-5.5.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/pbr-5.6.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/pbr-5.6.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/mock-2.0.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/mock-2.0.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/six-1.15.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/six-1.15.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/six-1.16.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/six-1.16.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/soupsieve-2.2.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/soupsieve-2.2.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/PyHamcrest-1.10.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/parameterized-0.7.5.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/parameterized-0.7.5.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/beautifulsoup4-4.9.3.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/beautifulsoup4-4.10.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/dataflow-worker.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/dataflow-worker.jar in 3 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/pipeline.pb in 0 seconds.
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 Create job: <Job
clientRequestId: '20210913121010252407-3550'
createTime: '2021-09-13T12:10:20.424677Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-09-13_05_10_15-13873130178695542247'
location: 'us-central1'
name: 'beamapp-jenkins-0913121010-251025'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-09-13T12:10:20.424677Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Created job with id: [2021-09-13_05_10_15-13873130178695542247]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 Submitted job: 2021-09-13_05_10_15-13873130178695542247
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-13_05_10_15-13873130178695542247?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py36-xdist.xml> -
=============== 26 failed, 3 passed, 3 skipped in 162.40 seconds ===============
> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 10m 14s
92 actionable tasks: 62 executed, 30 from cache
Publishing build scan...
https://gradle.com/s/ioygd5z7coubc
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8382
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8382/display/redirect>
Changes:
------------------------------------------
[...truncated 2.63 MB...]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_23_16_07-5737291489510279557?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
______________ ReshuffleTest.test_reshuffle_preserves_timestamps _______________
[gw0] linux -- Python 3.7.3 <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/bin/python3.7>
self = <apache_beam.transforms.util_test.ReshuffleTest testMethod=test_reshuffle_preserves_timestamps>
@pytest.mark.it_validatesrunner
def test_reshuffle_preserves_timestamps(self):
  with TestPipeline() as pipeline:

    # Create a PCollection and assign each element with a different timestamp.
    before_reshuffle = (
        pipeline
        | beam.Create([
            {
                'name': 'foo', 'timestamp': MIN_TIMESTAMP
            },
            {
                'name': 'foo', 'timestamp': 0
            },
            {
                'name': 'bar', 'timestamp': 33
            },
            {
                'name': 'bar', 'timestamp': 0
            },
        ])
        | beam.Map(
            lambda element: beam.window.TimestampedValue(
                element, element['timestamp'])))

    # Reshuffle the PCollection above and assign the timestamp of an element
    # to that element again.
    after_reshuffle = before_reshuffle | beam.Reshuffle()

    # Given an element, emits a string which contains the timestamp and the
    # name field of the element.
    def format_with_timestamp(element, timestamp=beam.DoFn.TimestampParam):
      t = str(timestamp)
      if timestamp == MIN_TIMESTAMP:
        t = 'MIN_TIMESTAMP'
      elif timestamp == MAX_TIMESTAMP:
        t = 'MAX_TIMESTAMP'
      return '{} - {}'.format(t, element['name'])

    # Combine each element in before_reshuffle with its timestamp.
    formatted_before_reshuffle = (
        before_reshuffle
        | "Get before_reshuffle timestamp" >> beam.Map(format_with_timestamp))

    # Combine each element in after_reshuffle with its timestamp.
    formatted_after_reshuffle = (
        after_reshuffle
        | "Get after_reshuffle timestamp" >> beam.Map(format_with_timestamp))

    expected_data = [
        'MIN_TIMESTAMP - foo',
        'Timestamp(0) - foo',
        'Timestamp(33) - bar',
        'Timestamp(0) - bar'
    ]

    # Can't compare formatted_before_reshuffle and formatted_after_reshuffle
    # directly, because they are deferred PCollections while equal_to only
    # takes a concrete argument.
    assert_that(
        formatted_before_reshuffle,
        equal_to(expected_data),
        label="formatted_before_reshuffle")
    assert_that(
        formatted_after_reshuffle,
        equal_to(expected_data),
>       label="formatted_after_reshuffle")
apache_beam/transforms/util_test.py:614:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/pipeline.py:587: in __exit__
    self.result = self.run()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <apache_beam.testing.test_pipeline.TestPipeline object at 0x7ff42e7b4a90>
test_runner_api = True
def run(self, test_runner_api=True):
  result = super(TestPipeline, self).run(
      test_runner_api=(
          False if self.not_use_test_runner_api else test_runner_api))
  if self.blocking:
    state = result.wait_until_finish()
>   assert state in (PipelineState.DONE, PipelineState.CANCELLED), \
        "Pipeline execution failed."
E   AssertionError: Pipeline execution failed.
E   assert 'FAILED' in ('DONE', 'CANCELLED')
apache_beam/testing/test_pipeline.py:117: AssertionError
------------------------------ Captured log call -------------------------------
INFO apache_beam.runners.portability.stager:stager.py:644 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/bin/python3.7>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
INFO apache_beam.runners.portability.stager:stager.py:300 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.7_sdk:2.34.0.dev
INFO root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210809
INFO root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210809" for Docker environment
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/requirements.txt...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/requirements.txt in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/pbr-5.5.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/pbr-5.5.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/pbr-5.6.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/pbr-5.6.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/mock-2.0.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/mock-2.0.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/six-1.15.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/six-1.15.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/six-1.16.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/six-1.16.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/soupsieve-2.2.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/soupsieve-2.2.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/PyHamcrest-1.10.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/parameterized-0.7.5.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/parameterized-0.7.5.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/beautifulsoup4-4.9.3.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/beautifulsoup4-4.10.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/dataflow-worker.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/dataflow-worker.jar in 4 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/pipeline.pb in 0 seconds.
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 Create job: <Job
clientRequestId: '20210913061602180174-3550'
createTime: '2021-09-13T06:16:09.247582Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-09-12_23_16_08-7973199312937205633'
location: 'us-central1'
name: 'beamapp-jenkins-0913061602-178589'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-09-13T06:16:09.247582Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Created job with id: [2021-09-12_23_16_08-7973199312937205633]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 Submitted job: 2021-09-12_23_16_08-7973199312937205633
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_23_16_08-7973199312937205633?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py37-xdist.xml> -
=============== 26 failed, 3 passed, 3 skipped in 127.33 seconds ===============
> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 15m 50s
92 actionable tasks: 62 executed, 30 from cache
Publishing build scan...
https://gradle.com/s/xbzhzwjmpdjjg
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8381
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8381/display/redirect>
Changes:
------------------------------------------
[...truncated 2.32 MB...]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/requirements.txt in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/pbr-5.6.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/pbr-5.6.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/mock-1.3.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/mock-1.3.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/mock-2.0.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/mock-2.0.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/six-1.16.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/six-1.16.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/soupsieve-2.2.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/soupsieve-2.2.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/soupsieve-2.2.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/soupsieve-2.2.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/PyHamcrest-1.10.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/parameterized-0.7.5.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/parameterized-0.7.5.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/beautifulsoup4-4.9.3.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/beautifulsoup4-4.10.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/dataflow-worker.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/dataflow-worker.jar in 3 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/pipeline.pb in 0 seconds.
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 Create job: <Job
clientRequestId: '20210913001745818666-3550'
createTime: '2021-09-13T00:17:52.182401Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-09-12_17_17_51-3112782731391603756'
location: 'us-central1'
name: 'beamapp-jenkins-0913001745-817515'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-09-13T00:17:52.182401Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Created job with id: [2021-09-12_17_17_51-3112782731391603756]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 Submitted job: 2021-09-12_17_17_51-3112782731391603756
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_17_17_51-3112782731391603756?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
__________________ SideInputsTest.test_reiterable_side_input ___________________
[gw5] linux -- Python 3.7.3 <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/bin/python3.7>
self = <apache_beam.transforms.sideinputs_test.SideInputsTest testMethod=test_reiterable_side_input>
    @pytest.mark.it_validatesrunner
    def test_reiterable_side_input(self):
      expected_side = frozenset(range(100))

      def check_reiteration(main, side):
        assert expected_side == set(side), side
        # Iterate a second time.
        assert expected_side == set(side), side
        # Iterate over two copies of the input at the same time.
        both = zip(side, side)
        first, second = zip(*both)
        assert expected_side == set(first), first
        assert expected_side == set(second), second
        # This will iterate over two copies of the side input, but offset.
        offset = [None] * (len(expected_side) // 2)
        both = zip(itertools.chain(side, offset), itertools.chain(offset, side))
        first, second = zip(*both)
        expected_and_none = frozenset.union(expected_side, [None])
        assert expected_and_none == set(first), first
        assert expected_and_none == set(second), second

      pipeline = self.create_pipeline()
      pcol = pipeline | 'start' >> beam.Create(['A', 'B'])
      side = pipeline | 'side' >> beam.Create(expected_side)
      _ = pcol | 'check' >> beam.Map(check_reiteration, beam.pvalue.AsIter(side))
>     pipeline.run()
apache_beam/transforms/sideinputs_test.py:220:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <apache_beam.testing.test_pipeline.TestPipeline object at 0x7f8fc99d6e48>
test_runner_api = True
    def run(self, test_runner_api=True):
      result = super(TestPipeline, self).run(
          test_runner_api=(
              False if self.not_use_test_runner_api else test_runner_api))
      if self.blocking:
        state = result.wait_until_finish()
>       assert state in (PipelineState.DONE, PipelineState.CANCELLED), \
            "Pipeline execution failed."
E       AssertionError: Pipeline execution failed.
E       assert 'FAILED' in ('DONE', 'CANCELLED')
apache_beam/testing/test_pipeline.py:117: AssertionError
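For context, the property this test asserts can be checked in plain Python, with a list standing in for the Beam side input. This is an illustrative sketch only (no apache_beam required), not the runner's actual side-input implementation:

```python
# Sketch of the reiteration property test_reiterable_side_input checks:
# a reiterable side input must yield its full contents on every pass.
# A plain list stands in for beam.pvalue.AsIter(side).
def check_reiteration(side):
    expected_side = frozenset(range(100))
    assert expected_side == set(side)
    # A second full pass must see the same elements.
    assert expected_side == set(side)
    # Two interleaved passes must each see everything: zip(lst, lst)
    # creates two independent iterators over the same list.
    first, second = zip(*zip(side, side))
    assert expected_side == set(first)
    assert expected_side == set(second)

check_reiteration(list(range(100)))  # a list is trivially reiterable

# A one-shot generator fails the second pass: it is already exhausted.
gen = (i for i in range(100))
assert set(gen) == frozenset(range(100))
assert set(gen) == set()  # nothing left on the second pass
```

A runner that hands the DoFn a one-shot iterator instead of a reiterable would fail exactly these assertions.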
------------------------------ Captured log call -------------------------------
INFO apache_beam.runners.portability.stager:stager.py:644 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/bin/python3.7>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
INFO apache_beam.runners.portability.stager:stager.py:300 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.7_sdk:2.34.0.dev
INFO root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210809
INFO root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210809" for Docker environment
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/requirements.txt...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/requirements.txt in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/pbr-5.6.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/pbr-5.6.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/mock-1.3.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/mock-1.3.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/mock-2.0.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/mock-2.0.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/six-1.16.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/six-1.16.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/soupsieve-2.2.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/soupsieve-2.2.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/soupsieve-2.2.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/soupsieve-2.2.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/PyHamcrest-1.10.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/parameterized-0.7.5.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/parameterized-0.7.5.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/beautifulsoup4-4.9.3.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/beautifulsoup4-4.10.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/dataflow-worker.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/dataflow-worker.jar in 3 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/pipeline.pb in 0 seconds.
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 Create job: <Job
clientRequestId: '20210913001743487351-3550'
createTime: '2021-09-13T00:17:50.073629Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-09-12_17_17_49-16632517925545001729'
location: 'us-central1'
name: 'beamapp-jenkins-0913001743-486227'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-09-13T00:17:50.073629Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Created job with id: [2021-09-12_17_17_49-16632517925545001729]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 Submitted job: 2021-09-12_17_17_49-16632517925545001729
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_17_17_49-16632517925545001729?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py37-xdist.xml> -
=============== 26 failed, 3 passed, 3 skipped in 101.46 seconds ===============
> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 17m 36s
92 actionable tasks: 62 executed, 30 from cache
Publishing build scan...
https://gradle.com/s/5zkw2ijjlyens
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8380
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8380/display/redirect>
Changes:
------------------------------------------
[...truncated 2.47 MB...]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_11_10_50-2277753668293080389?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
______________ ReshuffleTest.test_reshuffle_preserves_timestamps _______________
[gw3] linux -- Python 3.7.3 <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/bin/python3.7>
self = <apache_beam.transforms.util_test.ReshuffleTest testMethod=test_reshuffle_preserves_timestamps>
    @pytest.mark.it_validatesrunner
    def test_reshuffle_preserves_timestamps(self):
      with TestPipeline() as pipeline:

        # Create a PCollection and assign each element with a different timestamp.
        before_reshuffle = (
            pipeline
            | beam.Create([
                {
                    'name': 'foo', 'timestamp': MIN_TIMESTAMP
                },
                {
                    'name': 'foo', 'timestamp': 0
                },
                {
                    'name': 'bar', 'timestamp': 33
                },
                {
                    'name': 'bar', 'timestamp': 0
                },
            ])
            | beam.Map(
                lambda element: beam.window.TimestampedValue(
                    element, element['timestamp'])))

        # Reshuffle the PCollection above and assign the timestamp of an element
        # to that element again.
        after_reshuffle = before_reshuffle | beam.Reshuffle()

        # Given an element, emits a string which contains the timestamp and the
        # name field of the element.
        def format_with_timestamp(element, timestamp=beam.DoFn.TimestampParam):
          t = str(timestamp)
          if timestamp == MIN_TIMESTAMP:
            t = 'MIN_TIMESTAMP'
          elif timestamp == MAX_TIMESTAMP:
            t = 'MAX_TIMESTAMP'
          return '{} - {}'.format(t, element['name'])

        # Combine each element in before_reshuffle with its timestamp.
        formatted_before_reshuffle = (
            before_reshuffle
            | "Get before_reshuffle timestamp" >> beam.Map(format_with_timestamp))

        # Combine each element in after_reshuffle with its timestamp.
        formatted_after_reshuffle = (
            after_reshuffle
            | "Get after_reshuffle timestamp" >> beam.Map(format_with_timestamp))

        expected_data = [
            'MIN_TIMESTAMP - foo',
            'Timestamp(0) - foo',
            'Timestamp(33) - bar',
            'Timestamp(0) - bar'
        ]

        # Can't compare formatted_before_reshuffle and formatted_after_reshuffle
        # directly, because they are deferred PCollections while equal_to only
        # takes a concrete argument.
        assert_that(
            formatted_before_reshuffle,
            equal_to(expected_data),
            label="formatted_before_reshuffle")
        assert_that(
            formatted_after_reshuffle,
            equal_to(expected_data),
>           label="formatted_after_reshuffle")
apache_beam/transforms/util_test.py:614:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/pipeline.py:587: in __exit__
    self.result = self.run()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <apache_beam.testing.test_pipeline.TestPipeline object at 0x7f64dfcda518>
test_runner_api = True
    def run(self, test_runner_api=True):
      result = super(TestPipeline, self).run(
          test_runner_api=(
              False if self.not_use_test_runner_api else test_runner_api))
      if self.blocking:
        state = result.wait_until_finish()
>       assert state in (PipelineState.DONE, PipelineState.CANCELLED), \
            "Pipeline execution failed."
E       AssertionError: Pipeline execution failed.
E       assert 'FAILED' in ('DONE', 'CANCELLED')
apache_beam/testing/test_pipeline.py:117: AssertionError
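For readers skimming the archive: the string formatting the failing test asserts on can be reproduced in plain Python without a Beam installation. In this sketch, MIN_TS and MAX_TS are hypothetical stand-ins for Beam's MIN_TIMESTAMP/MAX_TIMESTAMP sentinels, and the `Timestamp(...)` rendering imitates `str()` of a Beam Timestamp; the real test gets `timestamp` injected via `beam.DoFn.TimestampParam`.

```python
# Hypothetical stand-ins for apache_beam's MIN_TIMESTAMP / MAX_TIMESTAMP
# sentinels (the real ones are Timestamp objects, not floats).
MIN_TS = float('-inf')
MAX_TS = float('inf')


def format_with_timestamp(element, timestamp):
    """Mirror the test's formatter: '<timestamp> - <name>'."""
    if timestamp == MIN_TS:
        t = 'MIN_TIMESTAMP'
    elif timestamp == MAX_TS:
        t = 'MAX_TIMESTAMP'
    else:
        # Imitates str(beam.utils.timestamp.Timestamp(n)).
        t = 'Timestamp({})'.format(timestamp)
    return '{} - {}'.format(t, element['name'])


print(format_with_timestamp({'name': 'foo'}, MIN_TS))  # MIN_TIMESTAMP - foo
print(format_with_timestamp({'name': 'bar'}, 33))      # Timestamp(33) - bar
```

The test's expectation is simply that this formatted output is identical before and after `beam.Reshuffle()`, i.e. that reshuffling preserves element timestamps.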
------------------------------ Captured log call -------------------------------
INFO apache_beam.runners.portability.stager:stager.py:644 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/bin/python3.7',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
INFO apache_beam.runners.portability.stager:stager.py:300 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.7_sdk:2.34.0.dev
INFO root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210809
INFO root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210809" for Docker environment
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/requirements.txt...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/requirements.txt in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/pbr-5.5.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/pbr-5.5.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/pbr-5.6.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/pbr-5.6.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/mock-2.0.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/mock-2.0.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/six-1.15.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/six-1.15.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/six-1.16.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/six-1.16.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/soupsieve-2.2.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/soupsieve-2.2.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/PyHamcrest-1.10.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/parameterized-0.7.5.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/parameterized-0.7.5.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/beautifulsoup4-4.9.3.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/beautifulsoup4-4.10.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/dataflow-worker.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/dataflow-worker.jar in 4 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/pipeline.pb in 0 seconds.
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 Create job: <Job
clientRequestId: '20210912181058251599-3550'
createTime: '2021-09-12T18:11:05.242398Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-09-12_11_11_04-6391030904901470593'
location: 'us-central1'
name: 'beamapp-jenkins-0912181058-249687'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-09-12T18:11:05.242398Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Created job with id: [2021-09-12_11_11_04-6391030904901470593]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 Submitted job: 2021-09-12_11_11_04-6391030904901470593
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_11_11_04-6391030904901470593?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py37-xdist.xml> -
=============== 26 failed, 3 passed, 3 skipped in 146.88 seconds ===============
> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 10m 52s
92 actionable tasks: 62 executed, 30 from cache
Publishing build scan...
https://gradle.com/s/kzzi2dz6lvo44
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8379
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8379/display/redirect>
Changes:
------------------------------------------
[...truncated 2.17 MB...]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_05_13_02-6252461799824349473?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
______________ ReshuffleTest.test_reshuffle_preserves_timestamps _______________
[gw0] linux -- Python 3.6.8 <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/bin/python3.6>
self = <apache_beam.transforms.util_test.ReshuffleTest testMethod=test_reshuffle_preserves_timestamps>
    @pytest.mark.it_validatesrunner
    def test_reshuffle_preserves_timestamps(self):
      with TestPipeline() as pipeline:

        # Create a PCollection and assign each element with a different timestamp.
        before_reshuffle = (
            pipeline
            | beam.Create([
                {
                    'name': 'foo', 'timestamp': MIN_TIMESTAMP
                },
                {
                    'name': 'foo', 'timestamp': 0
                },
                {
                    'name': 'bar', 'timestamp': 33
                },
                {
                    'name': 'bar', 'timestamp': 0
                },
            ])
            | beam.Map(
                lambda element: beam.window.TimestampedValue(
                    element, element['timestamp'])))

        # Reshuffle the PCollection above and assign the timestamp of an element
        # to that element again.
        after_reshuffle = before_reshuffle | beam.Reshuffle()

        # Given an element, emits a string which contains the timestamp and the
        # name field of the element.
        def format_with_timestamp(element, timestamp=beam.DoFn.TimestampParam):
          t = str(timestamp)
          if timestamp == MIN_TIMESTAMP:
            t = 'MIN_TIMESTAMP'
          elif timestamp == MAX_TIMESTAMP:
            t = 'MAX_TIMESTAMP'
          return '{} - {}'.format(t, element['name'])

        # Combine each element in before_reshuffle with its timestamp.
        formatted_before_reshuffle = (
            before_reshuffle
            | "Get before_reshuffle timestamp" >> beam.Map(format_with_timestamp))

        # Combine each element in after_reshuffle with its timestamp.
        formatted_after_reshuffle = (
            after_reshuffle
            | "Get after_reshuffle timestamp" >> beam.Map(format_with_timestamp))

        expected_data = [
            'MIN_TIMESTAMP - foo',
            'Timestamp(0) - foo',
            'Timestamp(33) - bar',
            'Timestamp(0) - bar'
        ]

        # Can't compare formatted_before_reshuffle and formatted_after_reshuffle
        # directly, because they are deferred PCollections while equal_to only
        # takes a concrete argument.
        assert_that(
            formatted_before_reshuffle,
            equal_to(expected_data),
            label="formatted_before_reshuffle")
        assert_that(
            formatted_after_reshuffle,
            equal_to(expected_data),
>           label="formatted_after_reshuffle")
apache_beam/transforms/util_test.py:614:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/pipeline.py:587: in __exit__
    self.result = self.run()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <apache_beam.testing.test_pipeline.TestPipeline object at 0x7fd6a75d7cc0>
test_runner_api = True
    def run(self, test_runner_api=True):
      result = super(TestPipeline, self).run(
          test_runner_api=(
              False if self.not_use_test_runner_api else test_runner_api))
      if self.blocking:
        state = result.wait_until_finish()
>       assert state in (PipelineState.DONE, PipelineState.CANCELLED), \
            "Pipeline execution failed."
E       AssertionError: Pipeline execution failed.
E       assert 'FAILED' in ('DONE', 'CANCELLED')
apache_beam/testing/test_pipeline.py:117: AssertionError
------------------------------ Captured log call -------------------------------
INFO apache_beam.runners.portability.stager:stager.py:644 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/bin/python3.6',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
INFO apache_beam.runners.portability.stager:stager.py:300 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.6_sdk:2.34.0.dev
INFO root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20210809
INFO root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20210809" for Docker environment
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/requirements.txt...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/requirements.txt in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/pbr-5.5.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/pbr-5.5.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/pbr-5.6.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/pbr-5.6.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/mock-2.0.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/mock-2.0.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/six-1.16.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/six-1.16.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/soupsieve-2.2.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/soupsieve-2.2.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/soupsieve-2.2.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/soupsieve-2.2.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/PyHamcrest-1.10.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/parameterized-0.7.5.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/parameterized-0.7.5.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/beautifulsoup4-4.9.3.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/beautifulsoup4-4.10.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/dataflow-worker.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/dataflow-worker.jar in 4 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/pipeline.pb in 0 seconds.
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 Create job: <Job
clientRequestId: '20210912121258069056-3550'
createTime: '2021-09-12T12:13:07.246978Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-09-12_05_13_05-10764178939461324569'
location: 'us-central1'
name: 'beamapp-jenkins-0912121258-043875'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-09-12T12:13:07.246978Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Created job with id: [2021-09-12_05_13_05-10764178939461324569]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 Submitted job: 2021-09-12_05_13_05-10764178939461324569
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_05_13_05-10764178939461324569?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py36-xdist.xml> -
=============== 26 failed, 3 passed, 3 skipped in 201.03 seconds ===============
> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 12m 49s
92 actionable tasks: 62 executed, 30 from cache
Publishing build scan...
https://gradle.com/s/msity5levn6qc
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8378
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8378/display/redirect>
Changes:
------------------------------------------
[...truncated 2.98 MB...]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:18:28.384Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:18:41.868Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:18:41.886Z: JOB_MESSAGE_ERROR: Workflow failed.
[...67 further identical QUOTA_EXCEEDED / "Workflow failed." JOB_MESSAGE_ERROR pairs (2021-09-12T07:18:54Z through 2021-09-12T07:33:24Z) omitted...]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.317Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2021-09-11_23_18_52-6940385899211472512.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.340Z: JOB_MESSAGE_BASIC: Finished operation Some Numbers/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream+Some Numbers/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets+Some Numbers/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Some Numbers/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Some Numbers/Map(decode)+ClassifyNumbers/FlatMap(<lambda at ptransform_test.py:256>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/CoGroupByKeyImpl/Tag[1]+assert_that/Group/CoGroupByKeyImpl/GroupByKey/WriteStream+assert:odd/WindowInto(WindowIntoFn)+assert:odd/ToVoidKey+assert:odd/Group/CoGroupByKeyImpl/Tag[1]+assert:odd/Group/CoGroupByKeyImpl/GroupByKey/WriteStream+assert:even/WindowInto(WindowIntoFn)+assert:even/ToVoidKey+assert:even/Group/CoGroupByKeyImpl/Tag[1]+assert:even/Group/CoGroupByKeyImpl/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.340Z: JOB_MESSAGE_BASIC: Finished operation assert:even/Create/Impulse+assert:even/Create/FlatMap(<lambda at core.py:2965>)+assert:even/Create/Map(decode)+assert:even/Group/CoGroupByKeyImpl/Tag[0]+assert:even/Group/CoGroupByKeyImpl/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.340Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/ReadStream+assert_that/Group/CoGroupByKeyImpl/GroupByKey/MergeBuckets+assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)+assert_that/Group/RestoreTags+assert_that/Unkey+assert_that/Match
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.340Z: JOB_MESSAGE_BASIC: Finished operation assert:even/Group/CoGroupByKeyImpl/GroupByKey/ReadStream+assert:even/Group/CoGroupByKeyImpl/GroupByKey/MergeBuckets+assert:even/Group/CoGroupByKeyImpl/MapTuple(collect_values)+assert:even/Group/RestoreTags+assert:even/Unkey+assert:even/Match
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.343Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at core.py:2965>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.343Z: JOB_MESSAGE_BASIC: Finished operation assert:odd/Create/Impulse+assert:odd/Create/FlatMap(<lambda at core.py:2965>)+assert:odd/Create/Map(decode)+assert:odd/Group/CoGroupByKeyImpl/Tag[0]+assert:odd/Group/CoGroupByKeyImpl/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.343Z: JOB_MESSAGE_BASIC: Finished operation Some Numbers/Impulse+Some Numbers/FlatMap(<lambda at core.py:2965>)+Some Numbers/MaybeReshuffle/Reshuffle/AddRandomKeys+Some Numbers/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Some Numbers/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.343Z: JOB_MESSAGE_BASIC: Finished operation assert:odd/Group/CoGroupByKeyImpl/GroupByKey/ReadStream+assert:odd/Group/CoGroupByKeyImpl/GroupByKey/MergeBuckets+assert:odd/Group/CoGroupByKeyImpl/MapTuple(collect_values)+assert:odd/Group/RestoreTags+assert:odd/Unkey+assert:odd/Match
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.433Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.482Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.503Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2021-09-11_23_18_52-6940385899211472512 is in state JOB_STATE_CANCELLING
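[Editor's note: the repeated QUOTA_EXCEEDED messages above mean the region's 2500-CPU quota was already fully consumed by other jobs, so even the single requested worker could not start. A minimal sketch of the check Dataflow is effectively enforcing (the function name and the assumption of 1 CPU per worker are hypothetical):]

```python
def fits_quota(cpus_in_use: float, workers: int, cpus_per_worker: int, limit: float) -> bool:
    """Return True if a pool of `workers` machines fits under the regional CPU quota."""
    return cpus_in_use + workers * cpus_per_worker <= limit

# With us-central1 already at its 2500.0-CPU limit, even one 1-CPU worker is rejected:
print(fits_quota(2500.0, 1, 1, 2500.0))  # prints False
# With headroom available, the same request would be admitted:
print(fits_quota(2400.0, 1, 1, 2500.0))  # prints True
```

[The usual remedies are requesting a quota increase for the region or pointing the test jobs at a less-saturated region via Dataflow's `--region` / `--max_num_workers` pipeline options.]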
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py37-xdist.xml> -
============== 25 failed, 4 passed, 3 skipped in 5180.95 seconds ===============
> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 43m 41s
92 actionable tasks: 62 executed, 30 from cache
Publishing build scan...
https://gradle.com/s/qca3daph5aztm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8377
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8377/display/redirect>
Changes:
------------------------------------------
[...truncated 6.13 MB...]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:35:37.287Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:35:37.306Z: JOB_MESSAGE_ERROR: Workflow failed.
[... the same QUOTA_EXCEEDED / "Workflow failed." pair repeated every ~13 seconds from 02:35:49 through 02:46:45 ...]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:46:58.181Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:46:58.196Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:47:11.179Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:47:11.192Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:47:23.496Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:47:23.517Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:47:35.881Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:47:35.907Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:47:48.757Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:47:48.777Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:48:01.311Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:48:01.326Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:48:14.228Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:48:14.243Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:48:26.487Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:48:26.506Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:48:39.393Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:48:39.415Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:48:52.045Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:48:52.067Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:49:05.419Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:49:05.437Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:49:17.859Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:49:17.882Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:49:30.874Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:49:30.888Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:49:43.453Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T02:49:43.477Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2021-09-11_18_34_56-3601821892704418557 is in state JOB_STATE_CANCELLING
=============================== warnings summary ===============================
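The log tail above shows Dataflow retrying worker-pool startup about every 13 seconds against an exhausted 'CPUS' quota until the job was cancelled. As an illustration only (this helper is hypothetical, not part of Beam or the CI tooling), a tail like this can be summarized mechanically:

```python
import re

# Hypothetical helper: summarize the JOB_MESSAGE_ERROR tail of a Dataflow log.
QUOTA_RE = re.compile(
    r"QUOTA_EXCEEDED: Quota '(?P<quota>\w+)' exceeded\. "
    r"Limit: (?P<limit>[\d.]+) in region (?P<region>[\w-]+)")

def summarize_quota_errors(lines):
    """Count QUOTA_EXCEEDED messages and report the quota/limit/region seen."""
    hits = [m.groupdict() for line in lines if (m := QUOTA_RE.search(line))]
    if not hits:
        return None
    return {"count": len(hits), **hits[0]}

# Sample lines modeled on the log above (three retry cycles).
sample = [
    "JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a "
    "failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: "
    "Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.",
    "JOB_MESSAGE_ERROR: Workflow failed.",
] * 3

print(summarize_quota_errors(sample))
```

The summary makes the failure mode obvious at a glance: one quota ('CPUS'), one region, and a retry count, instead of dozens of near-identical lines.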
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
def call(self, fn, *args, **kwargs):
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py38-xdist.xml> -
======== 19 failed, 10 passed, 3 skipped, 8 warnings in 9101.68 seconds ========
> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 2h 49m 44s
92 actionable tasks: 62 executed, 30 from cache
Publishing build scan...
https://gradle.com/s/ifukkx5nyvvcm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8376
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8376/display/redirect>
Changes:
------------------------------------------
[...truncated 2.59 MB...]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182303-392161.1631384583.392311/PyHamcrest-1.10.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182303-392161.1631384583.392311/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182303-392161.1631384583.392311/parameterized-0.7.5.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182303-392161.1631384583.392311/parameterized-0.7.5.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182303-392161.1631384583.392311/beautifulsoup4-4.9.3.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182303-392161.1631384583.392311/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182303-392161.1631384583.392311/beautifulsoup4-4.10.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182303-392161.1631384583.392311/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182303-392161.1631384583.392311/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182303-392161.1631384583.392311/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182303-392161.1631384583.392311/dataflow-worker.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182303-392161.1631384583.392311/dataflow-worker.jar in 3 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182303-392161.1631384583.392311/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182303-392161.1631384583.392311/pipeline.pb in 0 seconds.
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 Create job: <Job
clientRequestId: '20210911182303393370-3550'
createTime: '2021-09-11T18:23:10.260974Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-09-11_11_23_09-13687060579801032647'
location: 'us-central1'
name: 'beamapp-jenkins-0911182303-392161'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-09-11T18:23:10.260974Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Created job with id: [2021-09-11_11_23_09-13687060579801032647]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 Submitted job: 2021-09-11_11_23_09-13687060579801032647
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:823 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-11_11_23_09-13687060579801032647?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
__________________ SideInputsTest.test_reiterable_side_input ___________________
[gw0] linux -- Python 3.8.5 <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/bin/python3.8>
self = <apache_beam.transforms.sideinputs_test.SideInputsTest testMethod=test_reiterable_side_input>
    @pytest.mark.it_validatesrunner
    def test_reiterable_side_input(self):
      expected_side = frozenset(range(100))

      def check_reiteration(main, side):
        assert expected_side == set(side), side
        # Iterate a second time.
        assert expected_side == set(side), side
        # Iterate over two copies of the input at the same time.
        both = zip(side, side)
        first, second = zip(*both)
        assert expected_side == set(first), first
        assert expected_side == set(second), second
        # This will iterate over two copies of the side input, but offset.
        offset = [None] * (len(expected_side) // 2)
        both = zip(itertools.chain(side, offset), itertools.chain(offset, side))
        first, second = zip(*both)
        expected_and_none = frozenset.union(expected_side, [None])
        assert expected_and_none == set(first), first
        assert expected_and_none == set(second), second

      pipeline = self.create_pipeline()
      pcol = pipeline | 'start' >> beam.Create(['A', 'B'])
      side = pipeline | 'side' >> beam.Create(expected_side)
      _ = pcol | 'check' >> beam.Map(check_reiteration, beam.pvalue.AsIter(side))
>     pipeline.run()
apache_beam/transforms/sideinputs_test.py:220:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <apache_beam.testing.test_pipeline.TestPipeline object at 0x7f2abddaeaf0>
test_runner_api = True
    def run(self, test_runner_api=True):
      result = super(TestPipeline, self).run(
          test_runner_api=(
              False if self.not_use_test_runner_api else test_runner_api))
      if self.blocking:
        state = result.wait_until_finish()
>       assert state in (PipelineState.DONE, PipelineState.CANCELLED), \
            "Pipeline execution failed."
E       AssertionError: Pipeline execution failed.
E       assert 'FAILED' in ('DONE', 'CANCELLED')
apache_beam/testing/test_pipeline.py:117: AssertionError
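The failing test exercises Beam's reiterable side inputs: a side input delivered via beam.pvalue.AsIter must tolerate being iterated several times, including concurrently, which a one-shot iterator cannot. A stdlib-only sketch of the same reiteration check (illustrative names, no Beam dependency):

```python
import itertools

# Illustrative stand-in for the check_reiteration logic in the traceback above:
# a reiterable side input behaves like a sequence, not a one-shot iterator.
def check_reiteration(side, expected_side):
    assert expected_side == set(side), side   # first full pass
    assert expected_side == set(side), side   # second pass must see the same data
    # Two simultaneous copies of the input.
    first, second = zip(*zip(side, side))
    assert expected_side == set(first)
    assert expected_side == set(second)
    # Two copies offset against each other, padded with None.
    offset = [None] * (len(expected_side) // 2)
    both = zip(itertools.chain(side, offset), itertools.chain(offset, side))
    first, second = zip(*both)
    expected_and_none = frozenset.union(frozenset(expected_side), [None])
    assert expected_and_none == set(first)
    assert expected_and_none == set(second)

expected = frozenset(range(100))
check_reiteration(list(expected), expected)   # a list is reiterable: all checks pass

# A plain generator is NOT reiterable: the second set(side) sees nothing.
gen = (x for x in range(100))
try:
    check_reiteration(gen, expected)
except AssertionError:
    print("one-shot iterator fails the reiteration check")
```

This is why the runner must materialize (or be able to replay) the side input rather than hand the DoFn a consumable stream.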
------------------------------ Captured log call -------------------------------
INFO apache_beam.runners.portability.stager:stager.py:644 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/bin/python3.8',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
INFO apache_beam.runners.portability.stager:stager.py:300 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING root:environments.py:371 Make sure that locally built Python SDK docker image has Python 3.8 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.8_sdk:2.34.0.dev
INFO root:environments.py:295 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python38-fnapi:beam-master-20210809
INFO root:environments.py:302 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python38-fnapi:beam-master-20210809" for Docker environment
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/requirements.txt...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/requirements.txt in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/pbr-5.5.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/pbr-5.5.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/pbr-5.6.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/pbr-5.6.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/mock-2.0.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/mock-2.0.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/six-1.15.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/six-1.15.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/six-1.16.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/six-1.16.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/soupsieve-2.2.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/soupsieve-2.2.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/PyHamcrest-1.10.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/parameterized-0.7.5.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/parameterized-0.7.5.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/beautifulsoup4-4.9.3.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/beautifulsoup4-4.10.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/dataflow-worker.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/dataflow-worker.jar in 4 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:655 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911182302-579143.1631384582.579296/pipeline.pb in 0 seconds.
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 Create job: <Job
clientRequestId: '20210911182302580364-3550'
createTime: '2021-09-11T18:23:09.834201Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-09-11_11_23_08-5693468871190837961'
location: 'us-central1'
name: 'beamapp-jenkins-0911182302-579143'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-09-11T18:23:09.834201Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Created job with id: [2021-09-11_11_23_08-5693468871190837961]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 Submitted job: 2021-09-11_11_23_08-5693468871190837961
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:823 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-11_11_23_08-5693468871190837961?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
def call(self, fn, *args, **kwargs):
-- Docs: https://docs.pytest.org/en/latest/warnings.html
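The DeprecationWarning above comes from tenacity's use of the legacy generator-based coroutine style. A minimal sketch of the migration it asks for, deprecated since Python 3.8 and removed in Python 3.11 (fetch_value is a hypothetical stand-in, not tenacity's actual API):

```python
import asyncio

# Legacy style (what triggers the warning):
#   @asyncio.coroutine
#   def call(self, fn, *args, **kwargs): ...
# Modern replacement: a native "async def" coroutine.

async def fetch_value():
    await asyncio.sleep(0)  # yield control once, like an awaited retry
    return 42

result = asyncio.run(fetch_value())
print(result)  # 42
```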
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py38-xdist.xml> -
========= 26 failed, 3 passed, 3 skipped, 8 warnings in 176.71 seconds =========
> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 23m 2s
92 actionable tasks: 62 executed, 30 from cache
Publishing build scan...
https://gradle.com/s/xqlfj7ghnn2im
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8375
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8375/display/redirect>
Changes:
------------------------------------------
[...truncated 7.43 MB...]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T14:39:47.851Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T14:39:47.872Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO oauth2client.transport:transport.py:183 Refreshing due to a 401 (attempt 1/2)
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T14:53:05.874Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T14:53:15.591Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2021-09-11_06_38_23-10340177224615320777.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T14:53:15.611Z: JOB_MESSAGE_BASIC: Finished operation side/Impulse+side/FlatMap(<lambda at core.py:2965>)+side/MaybeReshuffle/Reshuffle/AddRandomKeys+side/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+side/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T14:53:15.611Z: JOB_MESSAGE_BASIC: Finished operation compute/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/GroupByKey/ReadStream+compute/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/GroupByKey/MergeBuckets+compute/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/Values+compute/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/StreamingPCollectionViewWriter
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T14:53:15.612Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at core.py:2965>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T14:53:15.612Z: JOB_MESSAGE_BASIC: Finished operation start/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream+start/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets+start/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+start/MaybeReshuffle/Reshuffle/RemoveRandomKeys+start/Map(decode)+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/CoGroupByKeyImpl/Tag[1]+assert_that/Group/CoGroupByKeyImpl/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T14:53:15.612Z: JOB_MESSAGE_BASIC: Finished operation side/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream+side/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets+side/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+side/MaybeReshuffle/Reshuffle/RemoveRandomKeys+side/Map(decode)+compute/MapToVoidKey0+compute/MapToVoidKey0+compute/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/PairWithVoidKey+compute/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T14:53:15.612Z: JOB_MESSAGE_BASIC: Finished operation start/Impulse+start/FlatMap(<lambda at core.py:2965>)+start/MaybeReshuffle/Reshuffle/AddRandomKeys+start/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+start/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T14:53:15.612Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/ReadStream+assert_that/Group/CoGroupByKeyImpl/GroupByKey/MergeBuckets+assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)+assert_that/Group/RestoreTags+assert_that/Unkey+assert_that/Match
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2021-09-11_06_38_23-10340177224615320777 is in state JOB_STATE_CANCELLING
=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
def call(self, fn, *args, **kwargs):
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py38-xdist.xml> -
======== 17 failed, 12 passed, 3 skipped, 8 warnings in 9094.27 seconds ========
> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 2h 53m 17s
92 actionable tasks: 62 executed, 30 from cache
Publishing build scan...
https://gradle.com/s/5tetkvlpbixrg
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8374
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8374/display/redirect>
Changes:
------------------------------------------
[...truncated 4.05 MB...]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:37:55.642Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:37:55.667Z: JOB_MESSAGE_ERROR: Workflow failed.
[...the same QUOTA_EXCEEDED / "Workflow failed." message pair repeated through 2021-09-11T07:48:56.619Z...]
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:48:56.651Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:49:09.325Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:49:09.347Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:49:22.535Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:49:22.567Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:49:35.491Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:49:35.513Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:49:48.131Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:49:48.160Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:50:00.693Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:50:00.728Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:50:12.802Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:50:12.831Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:50:26.866Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:50:26.888Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:50:39.638Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:50:39.669Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:50:51.884Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:50:51.915Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:51:04.698Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:51:04.753Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:51:17.056Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:51:17.081Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:51:30.051Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:51:30.070Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:51:42.325Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:51:42.380Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:51:54.497Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:51:54.515Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:52:07.155Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:52:07.190Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:52:19.865Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:52:19.896Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:52:32.608Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:52:32.639Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:52:45.219Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-11T07:52:45.243Z: JOB_MESSAGE_ERROR: Workflow failed.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2021-09-10_23_37_02-2756092287675935300 is in state JOB_STATE_CANCELLING
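Every failure above is the same regional quota error, reported once per worker-pool startup attempt until the job was cancelled. As a minimal sketch for triaging such logs (the regex and the `parse_quota_errors` helper are illustrative, not part of Beam or Dataflow), the distinct quota details can be pulled out of the JOB_MESSAGE_ERROR lines like this:

```python
import re

# Matches the Dataflow quota message seen in the log above, e.g.
# "QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1."
QUOTA_RE = re.compile(
    r"QUOTA_EXCEEDED: Quota '(?P<quota>\w+)' exceeded\. "
    r"Limit: (?P<limit>[\d.]+) in region (?P<region>[\w-]+)\."
)

def parse_quota_errors(log_lines):
    """Return one (quota, limit, region) tuple per quota-failure line."""
    hits = []
    for line in log_lines:
        m = QUOTA_RE.search(line)
        if m:
            hits.append((m.group("quota"), float(m.group("limit")), m.group("region")))
    return hits

sample = [
    "JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a "
    "failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: "
    "Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.",
    "JOB_MESSAGE_ERROR: Workflow failed.",
]
print(parse_quota_errors(sample))  # → [('CPUS', 2500.0, 'us-central1')]
```

Deduplicating the tuples makes it obvious that the whole run hit a single CPUS limit in us-central1, rather than several different quotas.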
=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
def call(self, fn, *args, **kwargs):
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py38-xdist.xml> -
======== 13 failed, 16 passed, 3 skipped, 8 warnings in 5805.15 seconds ========
> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 2h 2m 45s
92 actionable tasks: 62 executed, 30 from cache
Publishing build scan...
https://gradle.com/s/gndemy5hnrk66
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org