Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/09/12 07:44:14 UTC
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8378
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8378/display/redirect>
Changes:
------------------------------------------
[...truncated 2.98 MB...]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:18:28.384Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:18:41.868Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 2500.0 in region us-central1.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:18:41.886Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.317Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2021-09-11_23_18_52-6940385899211472512.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.340Z: JOB_MESSAGE_BASIC: Finished operation Some Numbers/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream+Some Numbers/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets+Some Numbers/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Some Numbers/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Some Numbers/Map(decode)+ClassifyNumbers/FlatMap(<lambda at ptransform_test.py:256>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/CoGroupByKeyImpl/Tag[1]+assert_that/Group/CoGroupByKeyImpl/GroupByKey/WriteStream+assert:odd/WindowInto(WindowIntoFn)+assert:odd/ToVoidKey+assert:odd/Group/CoGroupByKeyImpl/Tag[1]+assert:odd/Group/CoGroupByKeyImpl/GroupByKey/WriteStream+assert:even/WindowInto(WindowIntoFn)+assert:even/ToVoidKey+assert:even/Group/CoGroupByKeyImpl/Tag[1]+assert:even/Group/CoGroupByKeyImpl/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.340Z: JOB_MESSAGE_BASIC: Finished operation assert:even/Create/Impulse+assert:even/Create/FlatMap(<lambda at core.py:2965>)+assert:even/Create/Map(decode)+assert:even/Group/CoGroupByKeyImpl/Tag[0]+assert:even/Group/CoGroupByKeyImpl/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.340Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/ReadStream+assert_that/Group/CoGroupByKeyImpl/GroupByKey/MergeBuckets+assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)+assert_that/Group/RestoreTags+assert_that/Unkey+assert_that/Match
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.340Z: JOB_MESSAGE_BASIC: Finished operation assert:even/Group/CoGroupByKeyImpl/GroupByKey/ReadStream+assert:even/Group/CoGroupByKeyImpl/GroupByKey/MergeBuckets+assert:even/Group/CoGroupByKeyImpl/MapTuple(collect_values)+assert:even/Group/RestoreTags+assert:even/Unkey+assert:even/Match
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.343Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at core.py:2965>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.343Z: JOB_MESSAGE_BASIC: Finished operation assert:odd/Create/Impulse+assert:odd/Create/FlatMap(<lambda at core.py:2965>)+assert:odd/Create/Map(decode)+assert:odd/Group/CoGroupByKeyImpl/Tag[0]+assert:odd/Group/CoGroupByKeyImpl/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.343Z: JOB_MESSAGE_BASIC: Finished operation Some Numbers/Impulse+Some Numbers/FlatMap(<lambda at core.py:2965>)+Some Numbers/MaybeReshuffle/Reshuffle/AddRandomKeys+Some Numbers/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Some Numbers/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.343Z: JOB_MESSAGE_BASIC: Finished operation assert:odd/Group/CoGroupByKeyImpl/GroupByKey/ReadStream+assert:odd/Group/CoGroupByKeyImpl/GroupByKey/MergeBuckets+assert:odd/Group/CoGroupByKeyImpl/MapTuple(collect_values)+assert:odd/Group/RestoreTags+assert:odd/Unkey+assert:odd/Match
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.433Z: JOB_MESSAGE_DETAILED: Cleaning up.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.482Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-09-12T07:33:34.503Z: JOB_MESSAGE_BASIC: Stopping worker pool...
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2021-09-11_23_18_52-6940385899211472512 is in state JOB_STATE_CANCELLING
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py37-xdist.xml> -
============== 25 failed, 4 passed, 3 skipped in 5180.95 seconds ===============
> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 43m 41s
92 actionable tasks: 62 executed, 30 from cache
Publishing build scan...
https://gradle.com/s/qca3daph5aztm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #8385
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8385/display/redirect?page=changes>
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8384
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8384/display/redirect?page=changes>
Changes:
[rohde.samuel] [BEAM-12842] Add timestamp to test work item to deflake
[suztomo] [BEAM-12873] HL7v2IO: to leave schematizedData null, not empty
------------------------------------------
[...truncated 2.34 MB...]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/requirements.txt in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/pbr-5.5.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/pbr-5.5.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/pbr-5.6.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/pbr-5.6.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/mock-2.0.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/mock-2.0.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/six-1.16.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/six-1.16.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/soupsieve-2.2.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/soupsieve-2.2.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/soupsieve-2.2.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/soupsieve-2.2.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/PyHamcrest-1.10.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/parameterized-0.7.5.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/parameterized-0.7.5.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/beautifulsoup4-4.9.3.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/beautifulsoup4-4.10.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/dataflow-worker.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/dataflow-worker.jar in 4 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181637-393752.1631556997.394062/pipeline.pb in 0 seconds.
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 Create job: <Job
clientRequestId: '20210913181637395747-3550'
createTime: '2021-09-13T18:16:45.557152Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-09-13_11_16_44-5357630715320491649'
location: 'us-central1'
name: 'beamapp-jenkins-0913181637-393752'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-09-13T18:16:45.557152Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Created job with id: [2021-09-13_11_16_44-5357630715320491649]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 Submitted job: 2021-09-13_11_16_44-5357630715320491649
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-13_11_16_44-5357630715320491649?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
__________________ SideInputsTest.test_reiterable_side_input ___________________
[gw3] linux -- Python 3.6.8 <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/bin/python3.6>
self = <apache_beam.transforms.sideinputs_test.SideInputsTest testMethod=test_reiterable_side_input>
    @pytest.mark.it_validatesrunner
    def test_reiterable_side_input(self):
      expected_side = frozenset(range(100))

      def check_reiteration(main, side):
        assert expected_side == set(side), side
        # Iterate a second time.
        assert expected_side == set(side), side
        # Iterate over two copies of the input at the same time.
        both = zip(side, side)
        first, second = zip(*both)
        assert expected_side == set(first), first
        assert expected_side == set(second), second
        # This will iterate over two copies of the side input, but offset.
        offset = [None] * (len(expected_side) // 2)
        both = zip(itertools.chain(side, offset), itertools.chain(offset, side))
        first, second = zip(*both)
        expected_and_none = frozenset.union(expected_side, [None])
        assert expected_and_none == set(first), first
        assert expected_and_none == set(second), second

      pipeline = self.create_pipeline()
      pcol = pipeline | 'start' >> beam.Create(['A', 'B'])
      side = pipeline | 'side' >> beam.Create(expected_side)
      _ = pcol | 'check' >> beam.Map(check_reiteration, beam.pvalue.AsIter(side))
>     pipeline.run()

apache_beam/transforms/sideinputs_test.py:220:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <apache_beam.testing.test_pipeline.TestPipeline object at 0x7f83aad18c18>
test_runner_api = True
    def run(self, test_runner_api=True):
      result = super(TestPipeline, self).run(
          test_runner_api=(
              False if self.not_use_test_runner_api else test_runner_api))
      if self.blocking:
        state = result.wait_until_finish()
>       assert state in (PipelineState.DONE, PipelineState.CANCELLED), \
            "Pipeline execution failed."
E       AssertionError: Pipeline execution failed.
E       assert 'FAILED' in ('DONE', 'CANCELLED')

apache_beam/testing/test_pipeline.py:117: AssertionError
------------------------------ Captured log call -------------------------------
INFO apache_beam.runners.portability.stager:stager.py:644 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/bin/python3.6>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
INFO apache_beam.runners.portability.stager:stager.py:300 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.6_sdk:2.34.0.dev
INFO root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20210809
INFO root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20210809" for Docker environment
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/requirements.txt...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/requirements.txt in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/pbr-5.5.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/pbr-5.5.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/pbr-5.6.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/pbr-5.6.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/mock-2.0.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/mock-2.0.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/six-1.16.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/six-1.16.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/soupsieve-2.2.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/soupsieve-2.2.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/soupsieve-2.2.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/soupsieve-2.2.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/PyHamcrest-1.10.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/parameterized-0.7.5.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/parameterized-0.7.5.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/beautifulsoup4-4.9.3.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/beautifulsoup4-4.10.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/dataflow-worker.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/dataflow-worker.jar in 4 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913181639-716480.1631556999.716741/pipeline.pb in 0 seconds.
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 Create job: <Job
clientRequestId: '20210913181639718234-3550'
createTime: '2021-09-13T18:16:47.729962Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-09-13_11_16_46-11063527506877281827'
location: 'us-central1'
name: 'beamapp-jenkins-0913181639-716480'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-09-13T18:16:47.729962Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Created job with id: [2021-09-13_11_16_46-11063527506877281827]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 Submitted job: 2021-09-13_11_16_46-11063527506877281827
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-13_11_16_46-11063527506877281827?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py36-xdist.xml> -
=============== 26 failed, 3 passed, 3 skipped in 210.65 seconds ===============
> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 16m 23s
92 actionable tasks: 63 executed, 29 from cache
Publishing build scan...
https://gradle.com/s/nl6ak6r4bjbta
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8383
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8383/display/redirect>
Changes:
------------------------------------------
[...truncated 2.17 MB...]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-13_05_10_13-17203889701883546549?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
______________ ReshuffleTest.test_reshuffle_preserves_timestamps _______________
[gw2] linux -- Python 3.6.8 https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/bin/python3.6
self = <apache_beam.transforms.util_test.ReshuffleTest testMethod=test_reshuffle_preserves_timestamps>

    @pytest.mark.it_validatesrunner
    def test_reshuffle_preserves_timestamps(self):
      with TestPipeline() as pipeline:

        # Create a PCollection and assign each element with a different timestamp.
        before_reshuffle = (
            pipeline
            | beam.Create([
                {
                    'name': 'foo', 'timestamp': MIN_TIMESTAMP
                },
                {
                    'name': 'foo', 'timestamp': 0
                },
                {
                    'name': 'bar', 'timestamp': 33
                },
                {
                    'name': 'bar', 'timestamp': 0
                },
            ])
            | beam.Map(
                lambda element: beam.window.TimestampedValue(
                    element, element['timestamp'])))

        # Reshuffle the PCollection above and assign the timestamp of an element
        # to that element again.
        after_reshuffle = before_reshuffle | beam.Reshuffle()

        # Given an element, emits a string which contains the timestamp and the
        # name field of the element.
        def format_with_timestamp(element, timestamp=beam.DoFn.TimestampParam):
          t = str(timestamp)
          if timestamp == MIN_TIMESTAMP:
            t = 'MIN_TIMESTAMP'
          elif timestamp == MAX_TIMESTAMP:
            t = 'MAX_TIMESTAMP'
          return '{} - {}'.format(t, element['name'])

        # Combine each element in before_reshuffle with its timestamp.
        formatted_before_reshuffle = (
            before_reshuffle
            | "Get before_reshuffle timestamp" >> beam.Map(format_with_timestamp))

        # Combine each element in after_reshuffle with its timestamp.
        formatted_after_reshuffle = (
            after_reshuffle
            | "Get after_reshuffle timestamp" >> beam.Map(format_with_timestamp))

        expected_data = [
            'MIN_TIMESTAMP - foo',
            'Timestamp(0) - foo',
            'Timestamp(33) - bar',
            'Timestamp(0) - bar'
        ]

        # Can't compare formatted_before_reshuffle and formatted_after_reshuffle
        # directly, because they are deferred PCollections while equal_to only
        # takes a concrete argument.
        assert_that(
            formatted_before_reshuffle,
            equal_to(expected_data),
            label="formatted_before_reshuffle")
        assert_that(
            formatted_after_reshuffle,
            equal_to(expected_data),
>           label="formatted_after_reshuffle")

apache_beam/transforms/util_test.py:614:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/pipeline.py:587: in __exit__
    self.result = self.run()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <apache_beam.testing.test_pipeline.TestPipeline object at 0x7fb11e0b5198>
test_runner_api = True

    def run(self, test_runner_api=True):
      result = super(TestPipeline, self).run(
          test_runner_api=(
              False if self.not_use_test_runner_api else test_runner_api))
      if self.blocking:
        state = result.wait_until_finish()
>       assert state in (PipelineState.DONE, PipelineState.CANCELLED), \
          "Pipeline execution failed."
E       AssertionError: Pipeline execution failed.
E       assert 'FAILED' in ('DONE', 'CANCELLED')

apache_beam/testing/test_pipeline.py:117: AssertionError
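The string-formatting step the failing test applies can be sketched in stand-alone Python, with no Beam dependency. This is an illustrative sketch only: MIN_SENTINEL and MAX_SENTINEL are hypothetical stand-ins for Beam's MIN_TIMESTAMP/MAX_TIMESTAMP constants, and the Timestamp(...) string is built by hand to mimic what str() produces on a Beam Timestamp.

```python
# Stand-ins for Beam's MIN_TIMESTAMP / MAX_TIMESTAMP sentinel values
# (illustrative only; the real constants are Timestamp objects).
MIN_SENTINEL = float('-inf')
MAX_SENTINEL = float('inf')

def format_with_timestamp(element, timestamp):
    # Map the sentinel timestamps to readable names, mirroring the test;
    # otherwise mimic the str() form of a Beam Timestamp.
    if timestamp == MIN_SENTINEL:
        t = 'MIN_TIMESTAMP'
    elif timestamp == MAX_SENTINEL:
        t = 'MAX_TIMESTAMP'
    else:
        t = 'Timestamp({})'.format(timestamp)
    return '{} - {}'.format(t, element['name'])

# The same four elements the test creates, paired with their timestamps.
elements = [
    ({'name': 'foo'}, MIN_SENTINEL),
    ({'name': 'foo'}, 0),
    ({'name': 'bar'}, 33),
    ({'name': 'bar'}, 0),
]
formatted = [format_with_timestamp(e, ts) for e, ts in elements]
# formatted matches the test's expected_data:
# ['MIN_TIMESTAMP - foo', 'Timestamp(0) - foo',
#  'Timestamp(33) - bar', 'Timestamp(0) - bar']
```

In the real test this function runs inside beam.Map on both sides of the Reshuffle, so equal results before and after demonstrate that timestamps survive the reshuffle.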
------------------------------ Captured log call -------------------------------
INFO apache_beam.runners.portability.stager:stager.py:644 Executing command: ['https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/bin/python3.6', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
INFO apache_beam.runners.portability.stager:stager.py:300 Copying Beam SDK "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.6_sdk:2.34.0.dev
INFO root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20210809
INFO root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20210809" for Docker environment
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/requirements.txt...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/requirements.txt in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/pbr-5.5.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/pbr-5.5.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/pbr-5.6.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/pbr-5.6.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/mock-2.0.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/mock-2.0.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/six-1.15.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/six-1.15.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/six-1.16.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/six-1.16.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/soupsieve-2.2.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/soupsieve-2.2.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/PyHamcrest-1.10.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/parameterized-0.7.5.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/parameterized-0.7.5.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/beautifulsoup4-4.9.3.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/beautifulsoup4-4.10.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/dataflow-worker.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/dataflow-worker.jar in 3 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913121010-251025.1631535010.251221/pipeline.pb in 0 seconds.
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 Create job: <Job
clientRequestId: '20210913121010252407-3550'
createTime: '2021-09-13T12:10:20.424677Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-09-13_05_10_15-13873130178695542247'
location: 'us-central1'
name: 'beamapp-jenkins-0913121010-251025'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-09-13T12:10:20.424677Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Created job with id: [2021-09-13_05_10_15-13873130178695542247]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 Submitted job: 2021-09-13_05_10_15-13873130178695542247
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-13_05_10_15-13873130178695542247?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
- generated xml file: https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py36-xdist.xml -
=============== 26 failed, 3 passed, 3 skipped in 162.40 seconds ===============
> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 10m 14s
92 actionable tasks: 62 executed, 30 from cache
Publishing build scan...
https://gradle.com/s/ioygd5z7coubc
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8382
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8382/display/redirect>
Changes:
------------------------------------------
[...truncated 2.63 MB...]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_23_16_07-5737291489510279557?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
______________ ReshuffleTest.test_reshuffle_preserves_timestamps _______________
[gw0] linux -- Python 3.7.3 https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/bin/python3.7
self = <apache_beam.transforms.util_test.ReshuffleTest testMethod=test_reshuffle_preserves_timestamps>

    @pytest.mark.it_validatesrunner
    def test_reshuffle_preserves_timestamps(self):
      with TestPipeline() as pipeline:

        # Create a PCollection and assign each element with a different timestamp.
        before_reshuffle = (
            pipeline
            | beam.Create([
                {
                    'name': 'foo', 'timestamp': MIN_TIMESTAMP
                },
                {
                    'name': 'foo', 'timestamp': 0
                },
                {
                    'name': 'bar', 'timestamp': 33
                },
                {
                    'name': 'bar', 'timestamp': 0
                },
            ])
            | beam.Map(
                lambda element: beam.window.TimestampedValue(
                    element, element['timestamp'])))

        # Reshuffle the PCollection above and assign the timestamp of an element
        # to that element again.
        after_reshuffle = before_reshuffle | beam.Reshuffle()

        # Given an element, emits a string which contains the timestamp and the
        # name field of the element.
        def format_with_timestamp(element, timestamp=beam.DoFn.TimestampParam):
          t = str(timestamp)
          if timestamp == MIN_TIMESTAMP:
            t = 'MIN_TIMESTAMP'
          elif timestamp == MAX_TIMESTAMP:
            t = 'MAX_TIMESTAMP'
          return '{} - {}'.format(t, element['name'])

        # Combine each element in before_reshuffle with its timestamp.
        formatted_before_reshuffle = (
            before_reshuffle
            | "Get before_reshuffle timestamp" >> beam.Map(format_with_timestamp))

        # Combine each element in after_reshuffle with its timestamp.
        formatted_after_reshuffle = (
            after_reshuffle
            | "Get after_reshuffle timestamp" >> beam.Map(format_with_timestamp))

        expected_data = [
            'MIN_TIMESTAMP - foo',
            'Timestamp(0) - foo',
            'Timestamp(33) - bar',
            'Timestamp(0) - bar'
        ]

        # Can't compare formatted_before_reshuffle and formatted_after_reshuffle
        # directly, because they are deferred PCollections while equal_to only
        # takes a concrete argument.
        assert_that(
            formatted_before_reshuffle,
            equal_to(expected_data),
            label="formatted_before_reshuffle")
        assert_that(
            formatted_after_reshuffle,
            equal_to(expected_data),
>           label="formatted_after_reshuffle")

apache_beam/transforms/util_test.py:614:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/pipeline.py:587: in __exit__
    self.result = self.run()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <apache_beam.testing.test_pipeline.TestPipeline object at 0x7ff42e7b4a90>
test_runner_api = True

    def run(self, test_runner_api=True):
      result = super(TestPipeline, self).run(
          test_runner_api=(
              False if self.not_use_test_runner_api else test_runner_api))
      if self.blocking:
        state = result.wait_until_finish()
>       assert state in (PipelineState.DONE, PipelineState.CANCELLED), \
          "Pipeline execution failed."
E       AssertionError: Pipeline execution failed.
E       assert 'FAILED' in ('DONE', 'CANCELLED')

apache_beam/testing/test_pipeline.py:117: AssertionError
------------------------------ Captured log call -------------------------------
INFO apache_beam.runners.portability.stager:stager.py:644 Executing command: ['https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/bin/python3.7', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
INFO apache_beam.runners.portability.stager:stager.py:300 Copying Beam SDK "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.7_sdk:2.34.0.dev
INFO root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210809
INFO root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210809" for Docker environment
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/requirements.txt...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/requirements.txt in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/pbr-5.5.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/pbr-5.5.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/pbr-5.6.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/pbr-5.6.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/mock-2.0.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/mock-2.0.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/six-1.15.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/six-1.15.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/six-1.16.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/six-1.16.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/soupsieve-2.2.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/soupsieve-2.2.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/PyHamcrest-1.10.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/parameterized-0.7.5.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/parameterized-0.7.5.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/beautifulsoup4-4.9.3.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/beautifulsoup4-4.10.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/dataflow-worker.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/dataflow-worker.jar in 4 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913061602-178589.1631513762.178767/pipeline.pb in 0 seconds.
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 Create job: <Job
clientRequestId: '20210913061602180174-3550'
createTime: '2021-09-13T06:16:09.247582Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-09-12_23_16_08-7973199312937205633'
location: 'us-central1'
name: 'beamapp-jenkins-0913061602-178589'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-09-13T06:16:09.247582Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Created job with id: [2021-09-12_23_16_08-7973199312937205633]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 Submitted job: 2021-09-12_23_16_08-7973199312937205633
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_23_16_08-7973199312937205633?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py37-xdist.xml> -
=============== 26 failed, 3 passed, 3 skipped in 127.33 seconds ===============
> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 15m 50s
92 actionable tasks: 62 executed, 30 from cache
Publishing build scan...
https://gradle.com/s/xbzhzwjmpdjjg
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8381
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8381/display/redirect>
Changes:
------------------------------------------
[...truncated 2.32 MB...]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/requirements.txt in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/pbr-5.6.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/pbr-5.6.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/mock-1.3.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/mock-1.3.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/mock-2.0.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/mock-2.0.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/six-1.16.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/six-1.16.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/soupsieve-2.2.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/soupsieve-2.2.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/soupsieve-2.2.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/soupsieve-2.2.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/PyHamcrest-1.10.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/parameterized-0.7.5.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/parameterized-0.7.5.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/beautifulsoup4-4.9.3.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/beautifulsoup4-4.10.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/dataflow-worker.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/dataflow-worker.jar in 3 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001745-817515.1631492265.817683/pipeline.pb in 0 seconds.
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 Create job: <Job
clientRequestId: '20210913001745818666-3550'
createTime: '2021-09-13T00:17:52.182401Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-09-12_17_17_51-3112782731391603756'
location: 'us-central1'
name: 'beamapp-jenkins-0913001745-817515'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-09-13T00:17:52.182401Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Created job with id: [2021-09-12_17_17_51-3112782731391603756]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 Submitted job: 2021-09-12_17_17_51-3112782731391603756
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_17_17_51-3112782731391603756?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
__________________ SideInputsTest.test_reiterable_side_input ___________________
[gw5] linux -- Python 3.7.3 <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/bin/python3.7>
self = <apache_beam.transforms.sideinputs_test.SideInputsTest testMethod=test_reiterable_side_input>
  @pytest.mark.it_validatesrunner
  def test_reiterable_side_input(self):
    expected_side = frozenset(range(100))

    def check_reiteration(main, side):
      assert expected_side == set(side), side
      # Iterate a second time.
      assert expected_side == set(side), side
      # Iterate over two copies of the input at the same time.
      both = zip(side, side)
      first, second = zip(*both)
      assert expected_side == set(first), first
      assert expected_side == set(second), second
      # This will iterate over two copies of the side input, but offset.
      offset = [None] * (len(expected_side) // 2)
      both = zip(itertools.chain(side, offset), itertools.chain(offset, side))
      first, second = zip(*both)
      expected_and_none = frozenset.union(expected_side, [None])
      assert expected_and_none == set(first), first
      assert expected_and_none == set(second), second

    pipeline = self.create_pipeline()
    pcol = pipeline | 'start' >> beam.Create(['A', 'B'])
    side = pipeline | 'side' >> beam.Create(expected_side)
    _ = pcol | 'check' >> beam.Map(check_reiteration, beam.pvalue.AsIter(side))
>   pipeline.run()
apache_beam/transforms/sideinputs_test.py:220:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <apache_beam.testing.test_pipeline.TestPipeline object at 0x7f8fc99d6e48>
test_runner_api = True
  def run(self, test_runner_api=True):
    result = super(TestPipeline, self).run(
        test_runner_api=(
            False if self.not_use_test_runner_api else test_runner_api))
    if self.blocking:
      state = result.wait_until_finish()
>     assert state in (PipelineState.DONE, PipelineState.CANCELLED), \
          "Pipeline execution failed."
E     AssertionError: Pipeline execution failed.
E     assert 'FAILED' in ('DONE', 'CANCELLED')
apache_beam/testing/test_pipeline.py:117: AssertionError
------------------------------ Captured log call -------------------------------
INFO apache_beam.runners.portability.stager:stager.py:644 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/bin/python3.7>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
INFO apache_beam.runners.portability.stager:stager.py:300 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.7_sdk:2.34.0.dev
INFO root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210809
INFO root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210809" for Docker environment
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/requirements.txt...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/requirements.txt in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/pbr-5.6.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/pbr-5.6.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/mock-1.3.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/mock-1.3.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/mock-2.0.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/mock-2.0.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/six-1.16.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/six-1.16.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/soupsieve-2.2.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/soupsieve-2.2.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/soupsieve-2.2.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/soupsieve-2.2.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/PyHamcrest-1.10.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/parameterized-0.7.5.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/parameterized-0.7.5.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/beautifulsoup4-4.9.3.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/beautifulsoup4-4.10.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/dataflow-worker.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/dataflow-worker.jar in 3 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0913001743-486227.1631492263.486373/pipeline.pb in 0 seconds.
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 Create job: <Job
clientRequestId: '20210913001743487351-3550'
createTime: '2021-09-13T00:17:50.073629Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-09-12_17_17_49-16632517925545001729'
location: 'us-central1'
name: 'beamapp-jenkins-0913001743-486227'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-09-13T00:17:50.073629Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Created job with id: [2021-09-12_17_17_49-16632517925545001729]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 Submitted job: 2021-09-12_17_17_49-16632517925545001729
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_17_17_49-16632517925545001729?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py37-xdist.xml> -
=============== 26 failed, 3 passed, 3 skipped in 101.46 seconds ===============
> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 17m 36s
92 actionable tasks: 62 executed, 30 from cache
Publishing build scan...
https://gradle.com/s/5zkw2ijjlyens
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8380
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8380/display/redirect>
Changes:
------------------------------------------
[...truncated 2.47 MB...]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_11_10_50-2277753668293080389?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
______________ ReshuffleTest.test_reshuffle_preserves_timestamps _______________
[gw3] linux -- Python 3.7.3 <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/bin/python3.7>
self = <apache_beam.transforms.util_test.ReshuffleTest testMethod=test_reshuffle_preserves_timestamps>
  @pytest.mark.it_validatesrunner
  def test_reshuffle_preserves_timestamps(self):
    with TestPipeline() as pipeline:

      # Create a PCollection and assign each element with a different timestamp.
      before_reshuffle = (
          pipeline
          | beam.Create([
              {
                  'name': 'foo', 'timestamp': MIN_TIMESTAMP
              },
              {
                  'name': 'foo', 'timestamp': 0
              },
              {
                  'name': 'bar', 'timestamp': 33
              },
              {
                  'name': 'bar', 'timestamp': 0
              },
          ])
          | beam.Map(
              lambda element: beam.window.TimestampedValue(
                  element, element['timestamp'])))

      # Reshuffle the PCollection above and assign the timestamp of an element
      # to that element again.
      after_reshuffle = before_reshuffle | beam.Reshuffle()

      # Given an element, emits a string which contains the timestamp and the
      # name field of the element.
      def format_with_timestamp(element, timestamp=beam.DoFn.TimestampParam):
        t = str(timestamp)
        if timestamp == MIN_TIMESTAMP:
          t = 'MIN_TIMESTAMP'
        elif timestamp == MAX_TIMESTAMP:
          t = 'MAX_TIMESTAMP'
        return '{} - {}'.format(t, element['name'])

      # Combine each element in before_reshuffle with its timestamp.
      formatted_before_reshuffle = (
          before_reshuffle
          | "Get before_reshuffle timestamp" >> beam.Map(format_with_timestamp))

      # Combine each element in after_reshuffle with its timestamp.
      formatted_after_reshuffle = (
          after_reshuffle
          | "Get after_reshuffle timestamp" >> beam.Map(format_with_timestamp))

      expected_data = [
          'MIN_TIMESTAMP - foo',
          'Timestamp(0) - foo',
          'Timestamp(33) - bar',
          'Timestamp(0) - bar'
      ]

      # Can't compare formatted_before_reshuffle and formatted_after_reshuffle
      # directly, because they are deferred PCollections while equal_to only
      # takes a concrete argument.
      assert_that(
          formatted_before_reshuffle,
          equal_to(expected_data),
          label="formatted_before_reshuffle")
      assert_that(
          formatted_after_reshuffle,
          equal_to(expected_data),
>         label="formatted_after_reshuffle")
apache_beam/transforms/util_test.py:614:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/pipeline.py:587: in __exit__
    self.result = self.run()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <apache_beam.testing.test_pipeline.TestPipeline object at 0x7f64dfcda518>
test_runner_api = True
def run(self, test_runner_api=True):
  result = super(TestPipeline, self).run(
      test_runner_api=(
          False if self.not_use_test_runner_api else test_runner_api))
  if self.blocking:
    state = result.wait_until_finish()
>   assert state in (PipelineState.DONE, PipelineState.CANCELLED), \
        "Pipeline execution failed."
E   AssertionError: Pipeline execution failed.
E   assert 'FAILED' in ('DONE', 'CANCELLED')
apache_beam/testing/test_pipeline.py:117: AssertionError
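For reference, the formatting logic this assertion ultimately compares against can be exercised without Beam at all (the pipeline here failed at worker startup, not in this logic). A minimal, Beam-free sketch, assuming plain sentinel values in place of Beam's MIN_TIMESTAMP/MAX_TIMESTAMP and a simplified Timestamp-style rendering:

```python
# Beam-free sketch of the test's format_with_timestamp helper.
# MIN_TIMESTAMP / MAX_TIMESTAMP are stand-in sentinels here, not Beam's own types.
MIN_TIMESTAMP = float('-inf')
MAX_TIMESTAMP = float('inf')

def format_with_timestamp(element, timestamp):
    # Render sentinel timestamps symbolically, everything else numerically.
    if timestamp == MIN_TIMESTAMP:
        t = 'MIN_TIMESTAMP'
    elif timestamp == MAX_TIMESTAMP:
        t = 'MAX_TIMESTAMP'
    else:
        t = 'Timestamp({})'.format(timestamp)
    return '{} - {}'.format(t, element['name'])

expected = ['MIN_TIMESTAMP - foo', 'Timestamp(0) - foo',
            'Timestamp(33) - bar', 'Timestamp(0) - bar']
got = [format_with_timestamp(e, e['timestamp']) for e in [
    {'name': 'foo', 'timestamp': MIN_TIMESTAMP},
    {'name': 'foo', 'timestamp': 0},
    {'name': 'bar', 'timestamp': 33},
    {'name': 'bar', 'timestamp': 0},
]]
assert got == expected
```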
------------------------------ Captured log call -------------------------------
INFO apache_beam.runners.portability.stager:stager.py:644 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/bin/python3.7',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
INFO apache_beam.runners.portability.stager:stager.py:300 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.7_sdk:2.34.0.dev
INFO root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210809
INFO root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210809" for Docker environment
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/requirements.txt...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/requirements.txt in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/pbr-5.5.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/pbr-5.5.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/pbr-5.6.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/pbr-5.6.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/mock-2.0.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/mock-2.0.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/six-1.15.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/six-1.15.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/six-1.16.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/six-1.16.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/soupsieve-2.2.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/soupsieve-2.2.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/PyHamcrest-1.10.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/parameterized-0.7.5.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/parameterized-0.7.5.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/beautifulsoup4-4.9.3.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/beautifulsoup4-4.10.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/dataflow-worker.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/dataflow-worker.jar in 4 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912181058-249687.1631470258.249923/pipeline.pb in 0 seconds.
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 Create job: <Job
clientRequestId: '20210912181058251599-3550'
createTime: '2021-09-12T18:11:05.242398Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-09-12_11_11_04-6391030904901470593'
location: 'us-central1'
name: 'beamapp-jenkins-0912181058-249687'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-09-12T18:11:05.242398Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Created job with id: [2021-09-12_11_11_04-6391030904901470593]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 Submitted job: 2021-09-12_11_11_04-6391030904901470593
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_11_11_04-6391030904901470593?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py37-xdist.xml> -
=============== 26 failed, 3 passed, 3 skipped in 146.88 seconds ===============
> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 10m 52s
92 actionable tasks: 62 executed, 30 from cache
Publishing build scan...
https://gradle.com/s/kzzi2dz6lvo44
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
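When triaging runs like this one, the relevant lines can be pulled out of a raw console log mechanically. A small hypothetical helper (the marker strings are taken from the output above; adjust as needed):

```python
# Hypothetical triage helper: pull failure indicators out of a raw console log.
FAILURE_MARKERS = ('FAILED', 'JOB_MESSAGE_ERROR', 'QUOTA_EXCEEDED')

def failure_lines(log_text):
    # Keep only lines containing one of the failure markers above.
    return [line for line in log_text.splitlines()
            if any(marker in line for marker in FAILURE_MARKERS)]

sample = (
    "INFO ...dataflow_runner.py:236 2021-09-12T07:18:28.384Z: JOB_MESSAGE_ERROR: Workflow failed.\n"
    "INFO some healthy line\n"
    "> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests FAILED\n")
assert len(failure_lines(sample)) == 2
```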
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #8379
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8379/display/redirect>
Changes:
------------------------------------------
[...truncated 2.17 MB...]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_05_13_02-6252461799824349473?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
______________ ReshuffleTest.test_reshuffle_preserves_timestamps _______________
[gw0] linux -- Python 3.6.8 <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/bin/python3.6>
self = <apache_beam.transforms.util_test.ReshuffleTest testMethod=test_reshuffle_preserves_timestamps>
@pytest.mark.it_validatesrunner
def test_reshuffle_preserves_timestamps(self):
  with TestPipeline() as pipeline:

    # Create a PCollection and assign each element with a different timestamp.
    before_reshuffle = (
        pipeline
        | beam.Create([
            {
                'name': 'foo', 'timestamp': MIN_TIMESTAMP
            },
            {
                'name': 'foo', 'timestamp': 0
            },
            {
                'name': 'bar', 'timestamp': 33
            },
            {
                'name': 'bar', 'timestamp': 0
            },
        ])
        | beam.Map(
            lambda element: beam.window.TimestampedValue(
                element, element['timestamp'])))

    # Reshuffle the PCollection above and assign the timestamp of an element
    # to that element again.
    after_reshuffle = before_reshuffle | beam.Reshuffle()

    # Given an element, emits a string which contains the timestamp and the
    # name field of the element.
    def format_with_timestamp(element, timestamp=beam.DoFn.TimestampParam):
      t = str(timestamp)
      if timestamp == MIN_TIMESTAMP:
        t = 'MIN_TIMESTAMP'
      elif timestamp == MAX_TIMESTAMP:
        t = 'MAX_TIMESTAMP'
      return '{} - {}'.format(t, element['name'])

    # Combine each element in before_reshuffle with its timestamp.
    formatted_before_reshuffle = (
        before_reshuffle
        | "Get before_reshuffle timestamp" >> beam.Map(format_with_timestamp))

    # Combine each element in after_reshuffle with its timestamp.
    formatted_after_reshuffle = (
        after_reshuffle
        | "Get after_reshuffle timestamp" >> beam.Map(format_with_timestamp))

    expected_data = [
        'MIN_TIMESTAMP - foo',
        'Timestamp(0) - foo',
        'Timestamp(33) - bar',
        'Timestamp(0) - bar'
    ]

    # Can't compare formatted_before_reshuffle and formatted_after_reshuffle
    # directly, because they are deferred PCollections while equal_to only
    # takes a concrete argument.
    assert_that(
        formatted_before_reshuffle,
        equal_to(expected_data),
        label="formatted_before_reshuffle")
    assert_that(
        formatted_after_reshuffle,
        equal_to(expected_data),
>       label="formatted_after_reshuffle")
apache_beam/transforms/util_test.py:614:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/pipeline.py:587: in __exit__
    self.result = self.run()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <apache_beam.testing.test_pipeline.TestPipeline object at 0x7fd6a75d7cc0>
test_runner_api = True
def run(self, test_runner_api=True):
  result = super(TestPipeline, self).run(
      test_runner_api=(
          False if self.not_use_test_runner_api else test_runner_api))
  if self.blocking:
    state = result.wait_until_finish()
>   assert state in (PipelineState.DONE, PipelineState.CANCELLED), \
        "Pipeline execution failed."
E   AssertionError: Pipeline execution failed.
E   assert 'FAILED' in ('DONE', 'CANCELLED')
apache_beam/testing/test_pipeline.py:117: AssertionError
------------------------------ Captured log call -------------------------------
INFO apache_beam.runners.portability.stager:stager.py:644 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/bin/python3.6',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
INFO apache_beam.runners.portability.stager:stager.py:300 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.6_sdk:2.34.0.dev
INFO root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20210809
INFO root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20210809" for Docker environment
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/requirements.txt...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/requirements.txt in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/pbr-5.5.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/pbr-5.5.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/pbr-5.6.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/pbr-5.6.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/mock-2.0.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/mock-2.0.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/six-1.16.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/six-1.16.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/soupsieve-2.2.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/soupsieve-2.2.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/soupsieve-2.2.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/soupsieve-2.2.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/PyHamcrest-1.10.1.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/parameterized-0.7.5.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/parameterized-0.7.5.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/beautifulsoup4-4.9.3.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/beautifulsoup4-4.10.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/dataflow-worker.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/dataflow-worker.jar in 4 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0912121258-043875.1631448778.044149/pipeline.pb in 0 seconds.
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 Create job: <Job
clientRequestId: '20210912121258069056-3550'
createTime: '2021-09-12T12:13:07.246978Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-09-12_05_13_05-10764178939461324569'
location: 'us-central1'
name: 'beamapp-jenkins-0912121258-043875'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-09-12T12:13:07.246978Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Created job with id: [2021-09-12_05_13_05-10764178939461324569]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 Submitted job: 2021-09-12_05_13_05-10764178939461324569
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_05_13_05-10764178939461324569?project=apache-beam-testing
WARNING apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:64 Waiting indefinitely for streaming job.
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py36-xdist.xml> -
=============== 26 failed, 3 passed, 3 skipped in 201.03 seconds ===============
> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 149
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 12m 49s
92 actionable tasks: 62 executed, 30 from cache
Publishing build scan...
https://gradle.com/s/msity5levn6qc
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org